Apr 17 07:49:44.803983 ip-10-0-141-224 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:49:44.803995 ip-10-0-141-224 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:49:44.804005 ip-10-0-141-224 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:49:44.804232 ip-10-0-141-224 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:49:54.849370 ip-10-0-141-224 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:49:54.849387 ip-10-0-141-224 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 279d015b126a4e399646f8e73cd021b5 --
Apr 17 07:52:21.385296 ip-10-0-141-224 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:52:21.858222 ip-10-0-141-224 kubenswrapper[2560]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:21.858222 ip-10-0-141-224 kubenswrapper[2560]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:52:21.858222 ip-10-0-141-224 kubenswrapper[2560]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:21.858222 ip-10-0-141-224 kubenswrapper[2560]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:52:21.858222 ip-10-0-141-224 kubenswrapper[2560]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:21.859584 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.859495 2560 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:52:21.865192 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865174 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:21.865192 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865192 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865195 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865199 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865202 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865205 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865208 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865211 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865213 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865216 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865219 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865222 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865224 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865227 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865230 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865233 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865236 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865239 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865241 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865244 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865247 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:21.865263 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865251 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865255 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865258 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865262 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865265 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865269 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865272 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865275 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865278 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865281 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865284 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865287 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865290 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865293 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865295 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865298 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865301 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865303 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865306 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865332 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:21.865766 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865336 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865339 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865342 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865345 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865348 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865350 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865353 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865356 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865358 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865361 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865363 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865366 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865369 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865372 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865374 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865377 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865381 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865383 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865386 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:21.866526 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865389 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865393 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865397 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865400 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865404 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865407 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865409 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865412 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865415 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865417 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865420 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865422 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865425 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865427 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865430 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865433 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865436 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865438 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865441 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:21.867114 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865443 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865446 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865448 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865451 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865453 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865456 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.865458 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867258 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867275 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867280 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867283 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867286 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867289 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867292 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867295 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867298 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867302 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867304 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867307 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867310 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:21.867567 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867313 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867316 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867319 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867322 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867325 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867328 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867330 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867333 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867336 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867338 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867341 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867344 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867346 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867349 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867351 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867354 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867357 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867359 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867363 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867366 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:21.868057 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867369 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867374 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867377 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867380 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867382 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867385 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867388 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867390 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867393 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867396 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867399 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867402 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867404 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867407 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867410 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867413 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867416 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867419 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867421 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:21.868588 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867424 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867427 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867429 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867432 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867435 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867437 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867440 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867443 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867446 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867449 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867451 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867454 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867457 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867460 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867463 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867466 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867469 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867471 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867475 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867478 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:21.869070 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867480 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867483 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867486 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867489 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867491 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867495 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867500 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867503 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867506 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867508 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867511 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867514 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867516 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.867519 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867601 2560 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867611 2560 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867621 2560 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867629 2560 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867634 2560 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867637 2560 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867642 2560 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:52:21.869561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867647 2560 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867650 2560 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867653 2560 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867657 2560 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867661 2560 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867664 2560 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867667 2560 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867670 2560 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867673 2560 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867676 2560 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867679 2560 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867682 2560 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867687 2560 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867690 2560 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867694 2560 flags.go:64] FLAG: --config-dir=""
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867697 2560 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867700 2560 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867704 2560 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867708 2560 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867712 2560 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867715 2560 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867718 2560 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867722 2560 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867724 2560 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867728 2560 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:52:21.870103 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867731 2560 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867735 2560 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867738 2560 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867741 2560 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867744 2560 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867747 2560 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867750 2560 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867755 2560 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867759 2560 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867762 2560 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867765 2560 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867768 2560 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867772 2560 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867775 2560 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867778 2560 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867781 2560 flags.go:64] FLAG: --eviction-soft=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867784 2560 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867787 2560 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867790 2560 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867793 2560 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867797 2560 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867800 2560 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867802 2560 flags.go:64] FLAG: --feature-gates=""
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867806 2560 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 07:52:21.870712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867809 2560 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 07:52:21.870712 ip-10-0-141-224
kubenswrapper[2560]: I0417 07:52:21.867813 2560 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867816 2560 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867820 2560 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867823 2560 flags.go:64] FLAG: --help="false" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867826 2560 flags.go:64] FLAG: --hostname-override="ip-10-0-141-224.ec2.internal" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867829 2560 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867832 2560 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867835 2560 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867838 2560 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867842 2560 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867844 2560 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867848 2560 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867850 2560 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867853 2560 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867856 2560 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867859 2560 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867862 2560 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867865 2560 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867868 2560 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867871 2560 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867874 2560 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867877 2560 flags.go:64] FLAG: --lock-file="" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867879 2560 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867882 2560 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:52:21.871332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867885 2560 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867891 2560 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867896 2560 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867899 2560 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: 
I0417 07:52:21.867902 2560 flags.go:64] FLAG: --logging-format="text" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867905 2560 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867908 2560 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867911 2560 flags.go:64] FLAG: --manifest-url="" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867914 2560 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867919 2560 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867923 2560 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867927 2560 flags.go:64] FLAG: --max-pods="110" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867930 2560 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867933 2560 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867936 2560 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867939 2560 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867942 2560 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867945 2560 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867948 2560 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867956 2560 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867959 2560 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867962 2560 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867965 2560 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:52:21.871913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867969 2560 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867974 2560 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867978 2560 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867981 2560 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.867999 2560 flags.go:64] FLAG: --port="10250" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868003 2560 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868006 2560 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08982cd5e15897e97" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868009 2560 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868012 2560 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868015 
2560 flags.go:64] FLAG: --register-node="true" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868019 2560 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868027 2560 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868031 2560 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868033 2560 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868036 2560 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868039 2560 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868043 2560 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868047 2560 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868050 2560 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868052 2560 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868056 2560 flags.go:64] FLAG: --runonce="false" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868059 2560 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868062 2560 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868065 2560 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:52:21.868069 2560 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868072 2560 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:52:21.872478 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868075 2560 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868078 2560 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868082 2560 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868084 2560 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868087 2560 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868090 2560 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868093 2560 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868096 2560 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868099 2560 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868102 2560 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868108 2560 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868111 2560 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868114 2560 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868118 2560 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868121 2560 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868124 2560 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868127 2560 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868131 2560 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868134 2560 flags.go:64] FLAG: --v="2" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868138 2560 flags.go:64] FLAG: --version="false" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868142 2560 flags.go:64] FLAG: --vmodule="" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868147 2560 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.868150 2560 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868259 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:52:21.873100 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868262 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868266 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868269 2560 feature_gate.go:328] unrecognized feature gate: 
AzureMultiDisk Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868272 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868275 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868278 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868280 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868283 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868286 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868288 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868290 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868296 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868299 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868301 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868304 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868306 
2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868309 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868312 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868315 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868318 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:52:21.873679 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868321 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868323 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868325 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868330 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868333 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868336 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868340 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868343 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868345 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868349 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868351 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868354 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868357 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868360 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868362 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868365 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868368 2560 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868371 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868374 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868376 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:52:21.874243 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868379 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868382 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868384 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868388 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868391 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868394 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868397 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868399 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868402 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:52:21.874730 ip-10-0-141-224 
kubenswrapper[2560]: W0417 07:52:21.868404 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868407 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868409 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868412 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868415 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868418 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868420 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868423 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868426 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868428 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:52:21.874730 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868431 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868433 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868436 2560 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868439 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868441 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868444 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868447 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868449 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868451 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868454 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868457 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868459 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868462 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868465 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868468 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 
07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868470 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868474 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868477 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868479 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:52:21.875228 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868482 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868486 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868490 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868492 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868495 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868497 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.868500 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.869205 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.875590 2560 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.875606 2560 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875654 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875658 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875662 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875665 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875668 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:52:21.875773 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875671 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875674 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875676 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875679 2560 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875682 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875684 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875687 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875690 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875696 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875699 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875702 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875704 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875707 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875709 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875712 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875715 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:52:21.876173 
ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875718 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875720 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875723 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875725 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:52:21.876173 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875728 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875731 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875733 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875736 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875738 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875741 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875744 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875747 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875750 2560 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875752 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875755 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875758 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875760 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875763 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875765 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875768 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875771 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875773 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875776 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875779 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:52:21.876669 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875781 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875785 
2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875789 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875793 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875796 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875799 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875803 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875805 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875808 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875811 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875814 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875816 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875819 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875821 2560 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875824 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875827 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875830 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875832 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875835 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:52:21.877174 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875838 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875841 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875844 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875846 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875849 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875851 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875854 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 
17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875857 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875859 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875863 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875867 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875869 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875872 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875875 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875877 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875881 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875883 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875886 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875888 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 
07:52:21.875891 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:52:21.877664 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875893 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.875896 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.875901 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876018 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876024 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876027 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876031 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876036 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876039 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876042 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876045 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876047 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876051 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876053 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876056 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:52:21.878246 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876058 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876061 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876064 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876066 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 
07:52:21.876069 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876071 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876074 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876077 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876079 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876081 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876084 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876086 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876089 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876092 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876096 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876099 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876102 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876104 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876107 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:52:21.878610 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876110 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876113 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876116 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876118 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876121 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876123 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876126 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876129 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876131 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk 
Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876134 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876137 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876140 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876142 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876145 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876148 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876150 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876153 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876155 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876158 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876160 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:52:21.879083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876163 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876165 2560 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876167 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876170 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876172 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876175 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876178 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876181 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876184 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876187 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876189 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876192 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876194 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876197 2560 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876199 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876202 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876204 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876207 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876209 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:52:21.879570 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876212 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876214 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876217 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876219 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876222 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876224 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876226 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876229 2560 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876232 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876234 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876237 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876239 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876242 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876244 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876246 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:21.876249 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:52:21.880047 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.876253 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:52:21.880443 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:52:21.876366 2560 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 07:52:21.880443 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.878548 2560 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 07:52:21.880443 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.879455 2560 server.go:1019] "Starting client certificate rotation" Apr 17 07:52:21.880443 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.879553 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 07:52:21.880443 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.880205 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 07:52:21.906458 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.906435 2560 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 07:52:21.910756 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.910740 2560 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 07:52:21.925628 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.925605 2560 log.go:25] "Validated CRI v1 runtime API" Apr 17 07:52:21.933290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.933262 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:52:21.934362 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.934348 2560 log.go:25] "Validated CRI v1 image API" Apr 17 07:52:21.935699 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.935680 2560 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 07:52:21.940158 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:52:21.940137 2560 fs.go:135] Filesystem UUIDs: map[5367a4da-b42f-4e5a-8a15-48db058a19d9:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f83f7c81-689b-472c-9a68-5558da8ff9cf:/dev/nvme0n1p3] Apr 17 07:52:21.940233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.940157 2560 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 07:52:21.945137 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.945029 2560 manager.go:217] Machine: {Timestamp:2026-04-17 07:52:21.943873543 +0000 UTC m=+0.430995341 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100344 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2987462d175594f7b66dad9a571409 SystemUUID:ec298746-2d17-5594-f7b6-6dad9a571409 BootID:279d015b-126a-4e39-9646-f8e73cd021b5 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 
Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6c:a9:b3:23:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6c:a9:b3:23:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:ba:d8:b9:72:e3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 07:52:21.945137 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.945132 2560 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 07:52:21.945263 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.945251 2560 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 07:52:21.946893 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.946871 2560 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 07:52:21.947046 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.946895 2560 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-224.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:52:21.947094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.947059 2560 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:52:21.947094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.947067 2560 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:52:21.947094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.947080 2560 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:52:21.947208 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.947094 2560 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:52:21.948637 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.948627 2560 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:52:21.948740 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.948730 2560 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:52:21.951588 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.951578 2560 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:52:21.951626 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.951592 2560 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:52:21.951626 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.951604 2560 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 07:52:21.951626 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.951613 2560 kubelet.go:397] "Adding apiserver pod source" Apr 17 07:52:21.951718 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.951632 2560 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 07:52:21.952807 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.952664 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:52:21.952852 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.952816 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:52:21.954131 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.954114 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tk7xs" Apr 17 07:52:21.955825 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.955811 2560 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:52:21.957792 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.957775 2560 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:52:21.959835 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959820 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959840 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959850 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959857 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959866 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:52:21.959874 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959883 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959892 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959903 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:52:21.959909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959913 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:52:21.960197 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959926 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:52:21.960197 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.959940 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:52:21.960740 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.960723 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tk7xs" Apr 17 07:52:21.960925 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.960913 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:52:21.960973 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.960947 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:52:21.961252 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:21.961227 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-224.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 
07:52:21.961316 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:21.961248 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:52:21.964629 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.964614 2560 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:52:21.964706 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.964657 2560 server.go:1295] "Started kubelet" Apr 17 07:52:21.964765 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.964729 2560 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:52:21.964839 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.964800 2560 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:52:21.964878 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.964859 2560 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:52:21.965521 ip-10-0-141-224 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 07:52:21.966266 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.966227 2560 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:52:21.967333 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.967319 2560 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:52:21.973042 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973018 2560 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:52:21.973138 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973038 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:52:21.973702 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973684 2560 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:52:21.973768 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973704 2560 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:52:21.973768 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973720 2560 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:52:21.973842 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973829 2560 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:52:21.973842 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.973835 2560 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:52:21.973949 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:21.973930 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found" Apr 17 07:52:21.974838 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:21.974819 2560 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 07:52:21.975014 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.974967 2560 factory.go:55] Registering systemd factory Apr 17 07:52:21.975014 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975008 2560 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:52:21.975199 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975176 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:52:21.975334 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975315 2560 factory.go:153] Registering CRI-O factory Apr 17 07:52:21.975334 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975333 2560 factory.go:223] Registration of the crio container factory successfully Apr 17 07:52:21.975457 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975386 2560 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 07:52:21.975457 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975404 2560 factory.go:103] Registering Raw factory Apr 17 07:52:21.975457 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.975417 2560 manager.go:1196] Started watching for new ooms in manager Apr 17 07:52:21.976402 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.976387 2560 manager.go:319] Starting recovery of all containers Apr 17 07:52:21.982388 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:21.982358 2560 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-224.ec2.internal\" not found" node="ip-10-0-141-224.ec2.internal" Apr 17 07:52:21.982595 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.982569 2560 
nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-224.ec2.internal" not found Apr 17 07:52:21.986505 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.986490 2560 manager.go:324] Recovery completed Apr 17 07:52:21.990648 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.990635 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:21.996625 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.996592 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:21.996625 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.996624 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:21.996783 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.996635 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:21.997299 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.997284 2560 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:52:21.997299 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.997295 2560 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:52:21.997424 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.997313 2560 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:52:21.998730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:21.998715 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-224.ec2.internal" not found Apr 17 07:52:22.000225 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.000213 2560 policy_none.go:49] "None policy: Start" Apr 17 07:52:22.000273 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.000230 2560 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:52:22.000273 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:52:22.000239 2560 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:52:22.039625 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.039600 2560 manager.go:341] "Starting Device Plugin manager" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.039651 2560 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.039666 2560 server.go:85] "Starting device plugin registration server" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.039947 2560 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.039957 2560 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.040507 2560 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.040594 2560 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.040606 2560 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.041039 2560 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 07:52:22.045882 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.041085 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-224.ec2.internal\" not found" Apr 17 07:52:22.057876 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.057858 2560 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-224.ec2.internal" not found Apr 17 07:52:22.126561 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.126477 2560 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:52:22.127683 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.127671 2560 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:52:22.127743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.127697 2560 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:52:22.127743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.127720 2560 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 07:52:22.127743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.127727 2560 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:52:22.127882 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.127764 2560 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:52:22.132001 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.131967 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:52:22.140859 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.140844 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:22.141880 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.141859 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:22.141964 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.141893 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:22.141964 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.141905 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:22.141964 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.141928 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.150079 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.150064 2560 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.150132 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.150086 2560 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-224.ec2.internal\": node \"ip-10-0-141-224.ec2.internal\" not found" Apr 17 
07:52:22.164364 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.164339 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found" Apr 17 07:52:22.228381 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.228337 2560 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal"] Apr 17 07:52:22.228550 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.228439 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:22.230120 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.230105 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:22.230194 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.230135 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:22.230194 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.230147 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:22.231708 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.231696 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:22.231867 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.231854 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.231912 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.231883 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:22.233563 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.233544 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:22.233646 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.233574 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:22.233646 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.233545 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:22.233646 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.233604 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:22.233646 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.233616 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:22.233646 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.233585 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:22.234961 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.234941 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.235061 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.234966 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:52:22.235733 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.235717 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:52:22.235798 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.235748 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:52:22.235798 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.235763 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:52:22.247565 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.247547 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-224.ec2.internal\" not found" node="ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.251453 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.251439 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-224.ec2.internal\" not found" node="ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.264725 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.264708 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found" Apr 17 07:52:22.275029 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.275002 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/61dad8770b188117f71e7a16daae98c8-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal\" (UID: \"61dad8770b188117f71e7a16daae98c8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.275133 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.275035 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2d05103e605d22dd69e84834218ff183-config\") pod \"kube-apiserver-proxy-ip-10-0-141-224.ec2.internal\" (UID: \"2d05103e605d22dd69e84834218ff183\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.275133 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.275054 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/61dad8770b188117f71e7a16daae98c8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal\" (UID: \"61dad8770b188117f71e7a16daae98c8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.365633 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.365598 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found" Apr 17 07:52:22.376027 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.375978 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/61dad8770b188117f71e7a16daae98c8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal\" (UID: \"61dad8770b188117f71e7a16daae98c8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.376027 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.376027 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/61dad8770b188117f71e7a16daae98c8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal\" (UID: \"61dad8770b188117f71e7a16daae98c8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.376222 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.376071 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2d05103e605d22dd69e84834218ff183-config\") pod \"kube-apiserver-proxy-ip-10-0-141-224.ec2.internal\" (UID: \"2d05103e605d22dd69e84834218ff183\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.376222 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.376114 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2d05103e605d22dd69e84834218ff183-config\") pod \"kube-apiserver-proxy-ip-10-0-141-224.ec2.internal\" (UID: \"2d05103e605d22dd69e84834218ff183\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.376222 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.376124 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/61dad8770b188117f71e7a16daae98c8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal\" (UID: \"61dad8770b188117f71e7a16daae98c8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" Apr 17 07:52:22.376222 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.376170 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/61dad8770b188117f71e7a16daae98c8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal\" (UID: \"61dad8770b188117f71e7a16daae98c8\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal"
Apr 17 07:52:22.466500 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.466423 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found"
Apr 17 07:52:22.550000 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.549966 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal"
Apr 17 07:52:22.554058 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.554039 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal"
Apr 17 07:52:22.567023 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.566982 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found"
Apr 17 07:52:22.667515 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.667470 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found"
Apr 17 07:52:22.768125 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.768038 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found"
Apr 17 07:52:22.868638 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.868600 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found"
Apr 17 07:52:22.878917 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.878897 2560 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 07:52:22.879097 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.879076 2560 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:52:22.879140 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.879077 2560 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:52:22.963752 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.963705 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:47:21 +0000 UTC" deadline="2027-09-14 09:23:44.033608349 +0000 UTC"
Apr 17 07:52:22.963752 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.963746 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12361h31m21.069864107s"
Apr 17 07:52:22.968757 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:22.968731 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-224.ec2.internal\" not found"
Apr 17 07:52:22.969931 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.969913 2560 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:22.974132 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.974111 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal"
Apr 17 07:52:22.974214 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.974141 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 07:52:22.984217 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.984194 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:52:22.985816 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.985801 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal"
Apr 17 07:52:22.986809 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.986791 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:52:22.994873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:22.994859 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:52:23.006611 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.006582 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9zp52"
Apr 17 07:52:23.014962 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.014936 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9zp52"
Apr 17 07:52:23.040014 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:23.039953 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61dad8770b188117f71e7a16daae98c8.slice/crio-7e6543062c1d4717b5f391388a9e7a4ec10b0d29aa0739292138fc03faf4d84b WatchSource:0}: Error finding container 7e6543062c1d4717b5f391388a9e7a4ec10b0d29aa0739292138fc03faf4d84b: Status 404 returned error can't find the container with id 7e6543062c1d4717b5f391388a9e7a4ec10b0d29aa0739292138fc03faf4d84b
Apr 17 07:52:23.040407 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:23.040389 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d05103e605d22dd69e84834218ff183.slice/crio-64d197e054e8fc1554fd3cd9aa9ff6c32b9dbd5d222d0ae0a9b6ebca563b4837 WatchSource:0}: Error finding container 64d197e054e8fc1554fd3cd9aa9ff6c32b9dbd5d222d0ae0a9b6ebca563b4837: Status 404 returned error can't find the container with id 64d197e054e8fc1554fd3cd9aa9ff6c32b9dbd5d222d0ae0a9b6ebca563b4837
Apr 17 07:52:23.044911 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.044888 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:52:23.131157 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.130919 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" event={"ID":"61dad8770b188117f71e7a16daae98c8","Type":"ContainerStarted","Data":"7e6543062c1d4717b5f391388a9e7a4ec10b0d29aa0739292138fc03faf4d84b"}
Apr 17 07:52:23.131755 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.131732 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" event={"ID":"2d05103e605d22dd69e84834218ff183","Type":"ContainerStarted","Data":"64d197e054e8fc1554fd3cd9aa9ff6c32b9dbd5d222d0ae0a9b6ebca563b4837"}
Apr 17 07:52:23.559702 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.559667 2560 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:23.952542 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.952505 2560 apiserver.go:52] "Watching apiserver"
Apr 17 07:52:23.961309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.961276 2560 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 07:52:23.963200 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.963167 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-vqblm","kube-system/konnectivity-agent-65l87","kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal","openshift-cluster-node-tuning-operator/tuned-bpgx2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal","openshift-multus/multus-additional-cni-plugins-88cn6","openshift-multus/network-metrics-daemon-k4vcb","openshift-ovn-kubernetes/ovnkube-node-zfq9h","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6","openshift-dns/node-resolver-zh8tn","openshift-image-registry/node-ca-8dpsb","openshift-multus/multus-pqj85","openshift-network-diagnostics/network-check-target-vchg7"]
Apr 17 07:52:23.965403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.965376 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vqblm"
Apr 17 07:52:23.967682 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.967657 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.967785 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.967713 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.967874 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.967854 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 07:52:23.967978 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.967856 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-m2q46\""
Apr 17 07:52:23.969219 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.969198 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:23.969317 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.969282 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.970686 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.970667 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.971270 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.971230 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d2rqr\""
Apr 17 07:52:23.971598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.971422 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 07:52:23.971598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.971432 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ft4ks\""
Apr 17 07:52:23.971598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.971475 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.971598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.971475 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.971598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.971434 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 07:52:23.972293 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.972169 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:23.972293 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:23.972258 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:23.972794 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.972775 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 07:52:23.972887 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.972780 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sj2h5\""
Apr 17 07:52:23.972952 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.972882 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.973137 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.973116 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.973231 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.973210 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 07:52:23.973398 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.973378 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 07:52:23.973951 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.973808 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.975531 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.975510 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:23.976072 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976053 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.976159 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976082 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 07:52:23.976217 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976169 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 07:52:23.976268 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976252 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 07:52:23.976321 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976098 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 07:52:23.976321 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976107 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.976523 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.976504 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-xdwvs\""
Apr 17 07:52:23.977336 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.977314 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:23.977826 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.977805 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.977920 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.977904 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bx2wj\""
Apr 17 07:52:23.977980 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.977947 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.977980 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.977949 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 07:52:23.979114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.979091 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:23.979698 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.979681 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:23.979765 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.979735 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4mb8j\""
Apr 17 07:52:23.980018 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.979970 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.980018 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.979972 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.981111 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.981092 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pqj85"
Apr 17 07:52:23.981431 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.981214 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 07:52:23.982411 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982381 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-os-release\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.982514 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982414 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-var-lib-kubelet\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.982514 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982461 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.982514 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982488 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-kubernetes\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.982670 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982537 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-lib-modules\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.982670 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982565 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.982767 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982691 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-ovn\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.982767 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982721 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-var-lib-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.982767 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982749 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-node-log\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983053 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982805 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-tuned\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.983053 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982842 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-ovnkube-config\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983053 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982882 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:23.983053 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.982958 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-run-netns\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983053 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983022 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 07:52:23.983053 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:23.983015 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983024 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-systemd\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983093 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-cni-netd\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983117 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983162 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-env-overrides\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983189 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-cnibin\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983272 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.983351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983299 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-kubelet\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983377 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-log-socket\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983378 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983387 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wqc64\""
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983424 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-iptables-alerter-script\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983465 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-etc-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983496 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5m9\" (UniqueName: \"kubernetes.io/projected/322ec8c6-8646-443d-9065-38a19aa96bd1-kube-api-access-9s5m9\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983521 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983564 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-host-slash\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983596 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysctl-d\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.983688 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983669 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmcb6\" (UniqueName: \"kubernetes.io/projected/f6ca1d48-95c2-414b-af4e-838843029028-kube-api-access-zmcb6\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983713 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59fbd464-53fc-455b-9630-7e429d74587e-agent-certs\") pod \"konnectivity-agent-65l87\" (UID: \"59fbd464-53fc-455b-9630-7e429d74587e\") " pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983735 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-run\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983757 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-sys\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983780 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-registration-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983802 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59fbd464-53fc-455b-9630-7e429d74587e-konnectivity-ca\") pod \"konnectivity-agent-65l87\" (UID: \"59fbd464-53fc-455b-9630-7e429d74587e\") " pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983858 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-systemd\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983884 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-run-ovn-kubernetes\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.983915 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-cni-bin\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984068 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysconfig\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984095 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysctl-conf\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984126 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-tmp\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984190 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984177 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-cni-binary-copy\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.984743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984207 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-socket-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:23.984743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984230 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddnw6\" (UniqueName: \"kubernetes.io/projected/475827a7-8f6f-4574-b5a7-05d38afa9444-kube-api-access-ddnw6\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:23.984743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984262 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-modprobe-d\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.984743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984293 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:23.984743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984316 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-device-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:23.984743 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984337 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-sys-fs\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:23.985047 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.984856 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-host\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:23.986220 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.985911 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xjh45\""
Apr 17 07:52:23.986220 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986059 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986596 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6vp\" (UniqueName: 
\"kubernetes.io/projected/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-kube-api-access-9d6vp\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986654 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-slash\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986695 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/322ec8c6-8646-443d-9065-38a19aa96bd1-ovn-node-metrics-cert\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986736 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-system-cni-dir\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986798 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxq7\" (UniqueName: \"kubernetes.io/projected/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-kube-api-access-lwxq7\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm" Apr 17 07:52:23.987114 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:52:23.986832 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986865 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-systemd-units\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986914 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-ovnkube-script-lib\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986941 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-etc-selinux\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:23.987114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:23.986972 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9x55\" (UniqueName: \"kubernetes.io/projected/e2c81a49-c679-48e0-9245-cea52134eecc-kube-api-access-s9x55\") pod 
\"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.015597 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.015551 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:47:23 +0000 UTC" deadline="2027-11-25 16:19:27.782410602 +0000 UTC" Apr 17 07:52:24.015597 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.015576 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14096h27m3.766836644s" Apr 17 07:52:24.074751 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.074725 2560 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:52:24.087199 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087166 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-run-netns\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087199 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087204 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-systemd\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087223 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-cni-netd\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087231 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-run-netns\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087249 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087285 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-systemd\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087293 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087289 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-env-overrides\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087324 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-cni-netd\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087334 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-hostroot\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.087403 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087381 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5m6n\" (UniqueName: \"kubernetes.io/projected/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-kube-api-access-x5m6n\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087424 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-cnibin\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087455 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-cnibin\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087468 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087498 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-kubelet\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087524 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-log-socket\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087566 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-kubelet\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087579 2560 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-log-socket\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087598 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-hosts-file\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087626 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-iptables-alerter-script\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087650 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-etc-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087676 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5m9\" (UniqueName: \"kubernetes.io/projected/322ec8c6-8646-443d-9065-38a19aa96bd1-kube-api-access-9s5m9\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087703 2560 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-etc-kubernetes\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087735 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-env-overrides\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087730 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.087782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087780 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-host-slash\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087805 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysctl-d\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.088464 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:52:24.087832 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmcb6\" (UniqueName: \"kubernetes.io/projected/f6ca1d48-95c2-414b-af4e-838843029028-kube-api-access-zmcb6\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087872 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59fbd464-53fc-455b-9630-7e429d74587e-agent-certs\") pod \"konnectivity-agent-65l87\" (UID: \"59fbd464-53fc-455b-9630-7e429d74587e\") " pod="kube-system/konnectivity-agent-65l87" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087896 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-run\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087920 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-sys\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087944 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-registration-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 
07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.087970 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfxq\" (UniqueName: \"kubernetes.io/projected/03b7cc9b-71c3-4b06-9c37-a26058521703-kube-api-access-vcfxq\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088020 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-etc-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088030 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-socket-dir-parent\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088069 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-netns\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088086 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88cn6\" (UID: 
\"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088099 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-cni-bin\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088124 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-sys\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088134 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59fbd464-53fc-455b-9630-7e429d74587e-konnectivity-ca\") pod \"konnectivity-agent-65l87\" (UID: \"59fbd464-53fc-455b-9630-7e429d74587e\") " pod="kube-system/konnectivity-agent-65l87" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088187 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-iptables-alerter-script\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088194 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-systemd\") pod 
\"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.088464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088221 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-run-ovn-kubernetes\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088190 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-registration-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088235 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-host-slash\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088246 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-cni-bin\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088274 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/03b7cc9b-71c3-4b06-9c37-a26058521703-host\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088286 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-run-ovn-kubernetes\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088291 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-systemd\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088248 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088311 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-tmp-dir\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088090 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-run\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088329 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysctl-d\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088338 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysconfig\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088348 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-cni-bin\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088367 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysctl-conf\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088391 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-tmp\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088394 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysconfig\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088416 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-kubelet\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.089309 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088440 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-multus-certs\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088461 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-os-release\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088482 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-conf-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088503 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-sysctl-conf\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088508 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-cni-binary-copy\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088538 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-socket-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088565 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-cni-multus\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088591 2560 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088593 2560 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088615 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddnw6\" (UniqueName: \"kubernetes.io/projected/475827a7-8f6f-4574-b5a7-05d38afa9444-kube-api-access-ddnw6\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088640 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-modprobe-d\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088665 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:52:24.088743 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-device-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088795 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-socket-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.088907 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-modprobe-d\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089352 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-sys-fs\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089382 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-system-cni-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 
07:52:24.090063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089398 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-device-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089413 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-cni-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089428 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-sys-fs\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089442 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-daemon-config\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089472 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-host\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089511 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6vp\" (UniqueName: \"kubernetes.io/projected/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-kube-api-access-9d6vp\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089535 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-slash\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089583 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/322ec8c6-8646-443d-9065-38a19aa96bd1-ovn-node-metrics-cert\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089585 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-host\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089167 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: 
\"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089613 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cbe3752-39a7-4fce-ab72-9683a62aba37-cni-binary-copy\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089630 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-host-slash\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089638 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmps\" (UniqueName: \"kubernetes.io/projected/0cbe3752-39a7-4fce-ab72-9683a62aba37-kube-api-access-tdmps\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089773 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-system-cni-dir\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089814 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxq7\" (UniqueName: 
\"kubernetes.io/projected/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-kube-api-access-lwxq7\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089845 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-system-cni-dir\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089873 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-cni-binary-copy\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.090766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089849 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089921 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-systemd-units\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.089930 2560 secret.go:189] Couldn't get 
secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089945 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-ovnkube-script-lib\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.089971 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-etc-selinux\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.090009 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:24.589968536 +0000 UTC m=+3.077090326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090040 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9x55\" (UniqueName: \"kubernetes.io/projected/e2c81a49-c679-48e0-9245-cea52134eecc-kube-api-access-s9x55\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090048 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2c81a49-c679-48e0-9245-cea52134eecc-etc-selinux\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090068 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-cnibin\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090090 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-systemd-units\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:52:24.090095 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-os-release\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090121 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-var-lib-kubelet\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090145 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03b7cc9b-71c3-4b06-9c37-a26058521703-serviceca\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090171 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090198 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-kubernetes\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090222 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-lib-modules\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090245 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.091530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090267 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-ovn\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090294 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-var-lib-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090317 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-node-log\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090344 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-k8s-cni-cncf-io\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090374 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-tuned\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090400 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-ovnkube-config\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090409 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/59fbd464-53fc-455b-9630-7e429d74587e-konnectivity-ca\") pod \"konnectivity-agent-65l87\" (UID: \"59fbd464-53fc-455b-9630-7e429d74587e\") " pod="kube-system/konnectivity-agent-65l87" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090417 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-kubernetes\") pod \"tuned-bpgx2\" (UID: 
\"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090484 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-node-log\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090574 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-ovnkube-script-lib\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090625 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-var-lib-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090706 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-lib-modules\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090745 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-ovn\") pod \"ovnkube-node-zfq9h\" (UID: 
\"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090813 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-var-lib-kubelet\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090851 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/322ec8c6-8646-443d-9065-38a19aa96bd1-run-openvswitch\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090897 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/475827a7-8f6f-4574-b5a7-05d38afa9444-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090907 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/475827a7-8f6f-4574-b5a7-05d38afa9444-os-release\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6" Apr 17 07:52:24.092290 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.090930 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/322ec8c6-8646-443d-9065-38a19aa96bd1-ovnkube-config\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.092909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.092424 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/59fbd464-53fc-455b-9630-7e429d74587e-agent-certs\") pod \"konnectivity-agent-65l87\" (UID: \"59fbd464-53fc-455b-9630-7e429d74587e\") " pod="kube-system/konnectivity-agent-65l87" Apr 17 07:52:24.092909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.092798 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-tmp\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.092909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.092847 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-etc-tuned\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" Apr 17 07:52:24.093071 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.093056 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/322ec8c6-8646-443d-9065-38a19aa96bd1-ovn-node-metrics-cert\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" Apr 17 07:52:24.099368 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.099302 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmcb6\" (UniqueName: 
\"kubernetes.io/projected/f6ca1d48-95c2-414b-af4e-838843029028-kube-api-access-zmcb6\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:24.099851 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.099828 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5m9\" (UniqueName: \"kubernetes.io/projected/322ec8c6-8646-443d-9065-38a19aa96bd1-kube-api-access-9s5m9\") pod \"ovnkube-node-zfq9h\" (UID: \"322ec8c6-8646-443d-9065-38a19aa96bd1\") " pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:24.100254 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.100225 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxq7\" (UniqueName: \"kubernetes.io/projected/e9c30bee-b3b0-40b8-8f95-46b04aca3c77-kube-api-access-lwxq7\") pod \"iptables-alerter-vqblm\" (UID: \"e9c30bee-b3b0-40b8-8f95-46b04aca3c77\") " pod="openshift-network-operator/iptables-alerter-vqblm"
Apr 17 07:52:24.100408 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.100386 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6vp\" (UniqueName: \"kubernetes.io/projected/271684f4-9f94-4d1d-9c77-9fbf3a3219c9-kube-api-access-9d6vp\") pod \"tuned-bpgx2\" (UID: \"271684f4-9f94-4d1d-9c77-9fbf3a3219c9\") " pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:24.101421 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.101378 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddnw6\" (UniqueName: \"kubernetes.io/projected/475827a7-8f6f-4574-b5a7-05d38afa9444-kube-api-access-ddnw6\") pod \"multus-additional-cni-plugins-88cn6\" (UID: \"475827a7-8f6f-4574-b5a7-05d38afa9444\") " pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:24.101705 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.101686 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9x55\" (UniqueName: \"kubernetes.io/projected/e2c81a49-c679-48e0-9245-cea52134eecc-kube-api-access-s9x55\") pod \"aws-ebs-csi-driver-node-6qhl6\" (UID: \"e2c81a49-c679-48e0-9245-cea52134eecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:24.191043 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.190981 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03b7cc9b-71c3-4b06-9c37-a26058521703-host\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191051 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-tmp-dir\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191064 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03b7cc9b-71c3-4b06-9c37-a26058521703-host\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191081 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-kubelet\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191123 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-multus-certs\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191124 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-kubelet\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191151 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-os-release\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191158 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-multus-certs\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191201 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191189 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-conf-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191210 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-os-release\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191219 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-cni-multus\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191245 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191271 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-cni-multus\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191300 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-conf-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191273 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-system-cni-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191324 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-system-cni-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191366 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-cni-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191406 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-daemon-config\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191416 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-tmp-dir\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191425 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-cni-dir\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191436 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cbe3752-39a7-4fce-ab72-9683a62aba37-cni-binary-copy\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191461 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmps\" (UniqueName: \"kubernetes.io/projected/0cbe3752-39a7-4fce-ab72-9683a62aba37-kube-api-access-tdmps\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191501 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-cnibin\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191520 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03b7cc9b-71c3-4b06-9c37-a26058521703-serviceca\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191543 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-k8s-cni-cncf-io\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.191579 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191590 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-hostroot\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191614 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5m6n\" (UniqueName: \"kubernetes.io/projected/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-kube-api-access-x5m6n\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191621 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-k8s-cni-cncf-io\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191647 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-hosts-file\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191659 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-cnibin\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191670 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-hostroot\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191674 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-etc-kubernetes\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191714 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfxq\" (UniqueName: \"kubernetes.io/projected/03b7cc9b-71c3-4b06-9c37-a26058521703-kube-api-access-vcfxq\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191717 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-etc-kubernetes\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191765 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-socket-dir-parent\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191767 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-hosts-file\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191814 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-netns\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191826 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-socket-dir-parent\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191844 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-cni-bin\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191865 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-run-netns\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191907 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cbe3752-39a7-4fce-ab72-9683a62aba37-host-var-lib-cni-bin\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191966 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cbe3752-39a7-4fce-ab72-9683a62aba37-cni-binary-copy\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.191968 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cbe3752-39a7-4fce-ab72-9683a62aba37-multus-daemon-config\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.192233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.192057 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03b7cc9b-71c3-4b06-9c37-a26058521703-serviceca\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.199730 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.199703 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:24.199730 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.199733 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:24.199876 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.199748 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:24.199876 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.199823 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:24.699804635 +0000 UTC m=+3.186926421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:24.202288 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.202270 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmps\" (UniqueName: \"kubernetes.io/projected/0cbe3752-39a7-4fce-ab72-9683a62aba37-kube-api-access-tdmps\") pod \"multus-pqj85\" (UID: \"0cbe3752-39a7-4fce-ab72-9683a62aba37\") " pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.203033 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.202935 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfxq\" (UniqueName: \"kubernetes.io/projected/03b7cc9b-71c3-4b06-9c37-a26058521703-kube-api-access-vcfxq\") pod \"node-ca-8dpsb\" (UID: \"03b7cc9b-71c3-4b06-9c37-a26058521703\") " pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.203033 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.202947 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5m6n\" (UniqueName: \"kubernetes.io/projected/e87601ff-22f7-4eb6-bb9e-5d78a6b02e12-kube-api-access-x5m6n\") pod \"node-resolver-zh8tn\" (UID: \"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12\") " pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.279675 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.279633 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vqblm"
Apr 17 07:52:24.288529 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.288492 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:24.295174 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.295149 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bpgx2"
Apr 17 07:52:24.300802 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.300771 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-88cn6"
Apr 17 07:52:24.306444 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.306421 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:24.312041 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.312020 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6"
Apr 17 07:52:24.318558 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.318536 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zh8tn"
Apr 17 07:52:24.325063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.325045 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8dpsb"
Apr 17 07:52:24.331699 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.331678 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pqj85"
Apr 17 07:52:24.362467 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.362424 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:24.593897 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.593859 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:24.594108 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.594015 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:24.594108 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.594100 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.594080277 +0000 UTC m=+4.081202059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:24.625566 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.625526 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475827a7_8f6f_4574_b5a7_05d38afa9444.slice/crio-1fa22c2e237089268fa5eece42502278080ebd8d67e06253d929775578a969e0 WatchSource:0}: Error finding container 1fa22c2e237089268fa5eece42502278080ebd8d67e06253d929775578a969e0: Status 404 returned error can't find the container with id 1fa22c2e237089268fa5eece42502278080ebd8d67e06253d929775578a969e0
Apr 17 07:52:24.627407 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.627385 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod271684f4_9f94_4d1d_9c77_9fbf3a3219c9.slice/crio-dd143deddf2c573e7c778e81af4b9d92edf1d60dbd7ed423ef61eacbff7919b8 WatchSource:0}: Error finding container dd143deddf2c573e7c778e81af4b9d92edf1d60dbd7ed423ef61eacbff7919b8: Status 404 returned error can't find the container with id dd143deddf2c573e7c778e81af4b9d92edf1d60dbd7ed423ef61eacbff7919b8
Apr 17 07:52:24.629117 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.629061 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b7cc9b_71c3_4b06_9c37_a26058521703.slice/crio-a6731c2c863dec3188e05a9c98cf83999dd1857f879869b360d679c8f76e41d8 WatchSource:0}: Error finding container a6731c2c863dec3188e05a9c98cf83999dd1857f879869b360d679c8f76e41d8: Status 404 returned error can't find the container with id a6731c2c863dec3188e05a9c98cf83999dd1857f879869b360d679c8f76e41d8
Apr 17 07:52:24.632151 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.632131 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cbe3752_39a7_4fce_ab72_9683a62aba37.slice/crio-d92a42f73fe8b732a03ddc631eb72c9b7504be788fe82866f4acf4815ecb7ff3 WatchSource:0}: Error finding container d92a42f73fe8b732a03ddc631eb72c9b7504be788fe82866f4acf4815ecb7ff3: Status 404 returned error can't find the container with id d92a42f73fe8b732a03ddc631eb72c9b7504be788fe82866f4acf4815ecb7ff3
Apr 17 07:52:24.633958 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.633759 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322ec8c6_8646_443d_9065_38a19aa96bd1.slice/crio-cc21bfa182b34697f4fb933bed49410a5998370ec3bca93ea4cca55719e3145a WatchSource:0}: Error finding container cc21bfa182b34697f4fb933bed49410a5998370ec3bca93ea4cca55719e3145a: Status 404 returned error can't find the container with id cc21bfa182b34697f4fb933bed49410a5998370ec3bca93ea4cca55719e3145a
Apr 17 07:52:24.635283 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.635247 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87601ff_22f7_4eb6_bb9e_5d78a6b02e12.slice/crio-16e13f389aff71b0778652f338e8582117076ec49e3769a00029f156e0152e6a WatchSource:0}: Error finding container 16e13f389aff71b0778652f338e8582117076ec49e3769a00029f156e0152e6a: Status 404 returned error can't find the container with id 16e13f389aff71b0778652f338e8582117076ec49e3769a00029f156e0152e6a
Apr 17 07:52:24.636352 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.636247 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c81a49_c679_48e0_9245_cea52134eecc.slice/crio-fef2913aba4faf20f8464f3ca6849a3e39ebe979a885559dc80687f0efcf0c56 WatchSource:0}: Error finding container fef2913aba4faf20f8464f3ca6849a3e39ebe979a885559dc80687f0efcf0c56: Status 404 returned error can't find the container with id fef2913aba4faf20f8464f3ca6849a3e39ebe979a885559dc80687f0efcf0c56
Apr 17 07:52:24.637397 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.637260 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59fbd464_53fc_455b_9630_7e429d74587e.slice/crio-a7eb42909d22fa06a168b27c746ad4dc01b6530f312a1cb63ffd6da71e15ce37 WatchSource:0}: Error finding container a7eb42909d22fa06a168b27c746ad4dc01b6530f312a1cb63ffd6da71e15ce37: Status 404 returned error can't find the container with id a7eb42909d22fa06a168b27c746ad4dc01b6530f312a1cb63ffd6da71e15ce37
Apr 17 07:52:24.638298 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:24.638255 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c30bee_b3b0_40b8_8f95_46b04aca3c77.slice/crio-c8bc4af3c9f39fbdebaa625a7b836d73bf2f7572d0d3de26bb297f8b024cbe70 WatchSource:0}: Error finding container c8bc4af3c9f39fbdebaa625a7b836d73bf2f7572d0d3de26bb297f8b024cbe70: Status 404 returned error can't find the container with id c8bc4af3c9f39fbdebaa625a7b836d73bf2f7572d0d3de26bb297f8b024cbe70
Apr 17 07:52:24.795520 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:24.795488 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:24.795696 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.795660 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:24.795696 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.795688 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:24.795781 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.795702 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:24.795781 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:24.795761 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.795742224 +0000 UTC m=+4.282864006 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:25.016549 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.016305 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:47:23 +0000 UTC" deadline="2027-12-05 10:46:48.097054962 +0000 UTC"
Apr 17 07:52:25.016549 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.016342 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14330h54m23.080717428s"
Apr 17 07:52:25.149562 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.149524 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"cc21bfa182b34697f4fb933bed49410a5998370ec3bca93ea4cca55719e3145a"}
Apr 17 07:52:25.155351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.155274 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8dpsb" event={"ID":"03b7cc9b-71c3-4b06-9c37-a26058521703","Type":"ContainerStarted","Data":"a6731c2c863dec3188e05a9c98cf83999dd1857f879869b360d679c8f76e41d8"}
Apr 17 07:52:25.161220 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.161191 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerStarted","Data":"1fa22c2e237089268fa5eece42502278080ebd8d67e06253d929775578a969e0"}
Apr 17 07:52:25.173258 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.173189 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vqblm" event={"ID":"e9c30bee-b3b0-40b8-8f95-46b04aca3c77","Type":"ContainerStarted","Data":"c8bc4af3c9f39fbdebaa625a7b836d73bf2f7572d0d3de26bb297f8b024cbe70"}
Apr 17 07:52:25.179598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.179564 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqj85" event={"ID":"0cbe3752-39a7-4fce-ab72-9683a62aba37","Type":"ContainerStarted","Data":"d92a42f73fe8b732a03ddc631eb72c9b7504be788fe82866f4acf4815ecb7ff3"}
Apr 17 07:52:25.187844 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.187813 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" event={"ID":"271684f4-9f94-4d1d-9c77-9fbf3a3219c9","Type":"ContainerStarted","Data":"dd143deddf2c573e7c778e81af4b9d92edf1d60dbd7ed423ef61eacbff7919b8"}
Apr 17 07:52:25.192711 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.192673 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" event={"ID":"2d05103e605d22dd69e84834218ff183","Type":"ContainerStarted","Data":"3d73db8012aa4e15e7986e9dd56d7a74ba249930968590ab379a1ebd8ba5d4ac"}
Apr 17 07:52:25.196790 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.196741 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-65l87" event={"ID":"59fbd464-53fc-455b-9630-7e429d74587e","Type":"ContainerStarted","Data":"a7eb42909d22fa06a168b27c746ad4dc01b6530f312a1cb63ffd6da71e15ce37"}
Apr 17 07:52:25.202739 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.202682 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" event={"ID":"e2c81a49-c679-48e0-9245-cea52134eecc","Type":"ContainerStarted","Data":"fef2913aba4faf20f8464f3ca6849a3e39ebe979a885559dc80687f0efcf0c56"}
Apr 17 07:52:25.215617 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.215576 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zh8tn" event={"ID":"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12","Type":"ContainerStarted","Data":"16e13f389aff71b0778652f338e8582117076ec49e3769a00029f156e0152e6a"}
Apr 17 07:52:25.605731 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.605690 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:25.605917 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:25.605876 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:25.606002 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:25.605942 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:27.605922044 +0000 UTC m=+6.093043844 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:25.807840 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:25.807224 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:25.807840 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:25.807385 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:25.807840 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:25.807403 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:25.807840 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:25.807415 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:25.807840 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:25.807474 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed.
No retries permitted until 2026-04-17 07:52:27.80745444 +0000 UTC m=+6.294576226 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:26.139236 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:26.138297 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:26.139236 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:26.138495 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:26.139236 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:26.138639 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:26.139236 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:26.138760 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:26.246118 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:26.245441 2560 generic.go:358] "Generic (PLEG): container finished" podID="61dad8770b188117f71e7a16daae98c8" containerID="e19c65d8ffc6bd58952fd11dfec7889eace9b0c58e59a51b01db5d9d3a42dd6f" exitCode=0 Apr 17 07:52:26.246118 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:26.245717 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" event={"ID":"61dad8770b188117f71e7a16daae98c8","Type":"ContainerDied","Data":"e19c65d8ffc6bd58952fd11dfec7889eace9b0c58e59a51b01db5d9d3a42dd6f"} Apr 17 07:52:26.259851 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:26.259765 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-224.ec2.internal" podStartSLOduration=4.259745914 podStartE2EDuration="4.259745914s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:25.20417387 +0000 UTC m=+3.691295674" watchObservedRunningTime="2026-04-17 07:52:26.259745914 +0000 UTC m=+4.746867719" Apr 17 07:52:27.252110 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:27.252075 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" event={"ID":"61dad8770b188117f71e7a16daae98c8","Type":"ContainerStarted","Data":"1ee354e0927b29177d044a1041c199a3d63e1b3ede9a0214b6b92eda05a01fd6"} Apr 17 07:52:27.622144 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:27.622096 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:27.622491 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:27.622301 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:27.622491 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:27.622367 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:31.62234794 +0000 UTC m=+10.109469727 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:27.824827 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:27.824206 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:27.824827 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:27.824389 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:27.824827 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:27.824407 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:27.824827 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:27.824420 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:27.824827 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:27.824477 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:31.824458852 +0000 UTC m=+10.311580652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:28.132142 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:28.132112 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:28.132320 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:28.132245 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:28.132623 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:28.132603 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:28.132713 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:28.132695 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:30.129419 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:30.128716 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:30.129419 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:30.128858 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:30.129419 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:30.129273 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:30.129419 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:30.129370 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:31.661464 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:31.661423 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:31.661920 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:31.661573 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:31.661920 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:31.661644 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:39.661624141 +0000 UTC m=+18.148745938 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:31.862458 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:31.862418 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:31.862651 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:31.862616 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:31.862651 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:31.862643 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:31.862774 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:31.862657 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:31.862774 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:31.862722 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:39.862702318 +0000 UTC m=+18.349824104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:32.129622 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:32.129185 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:32.129622 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:32.129293 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:32.129622 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:32.129431 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:32.129622 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:32.129534 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:34.128705 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:34.128663 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:34.129171 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:34.128664 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:34.129171 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:34.128938 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:34.129171 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:34.128806 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:36.128757 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:36.128721 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:36.129220 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:36.128846 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:36.129220 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:36.128930 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:36.129220 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:36.129055 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:38.128557 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:38.128522 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:38.129028 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:38.128572 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:38.129028 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:38.128659 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:38.129028 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:38.128797 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:39.720748 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:39.720710 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:39.721275 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:39.720879 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:39.721275 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:39.720947 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:55.720930325 +0000 UTC m=+34.208052111 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:39.922616 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:39.922579 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:39.922768 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:39.922737 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:39.922768 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:39.922754 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:39.922768 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:39.922765 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:39.922904 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:39.922832 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl 
podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:55.922811305 +0000 UTC m=+34.409933130 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:40.128650 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:40.128613 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:40.128794 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:40.128613 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:40.128794 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:40.128728 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:52:40.128887 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:40.128798 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:42.129758 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.129567 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:42.130626 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:42.129837 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d" Apr 17 07:52:42.130626 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.129664 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:42.130626 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:42.130015 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:42.281186 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.280935 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqj85" event={"ID":"0cbe3752-39a7-4fce-ab72-9683a62aba37","Type":"ContainerStarted","Data":"6fecd5ef63e1e8e4b9e1a0b5350e9b3e63cbc8fedfc1b2b292edce2cfe3f8b32"}
Apr 17 07:52:42.282592 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.282564 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" event={"ID":"271684f4-9f94-4d1d-9c77-9fbf3a3219c9","Type":"ContainerStarted","Data":"08d97ad1366c591f6e3d80a94539935c12445e5d69758d3c4699927592f0b203"}
Apr 17 07:52:42.284095 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.284055 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-65l87" event={"ID":"59fbd464-53fc-455b-9630-7e429d74587e","Type":"ContainerStarted","Data":"c53ab3842b5c750242f4f040190135bd1430f5fadf9174355470e83e4869a7de"}
Apr 17 07:52:42.286825 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.286643 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" event={"ID":"e2c81a49-c679-48e0-9245-cea52134eecc","Type":"ContainerStarted","Data":"6c40e333ff6d43795737c046b51556a52d90038a0b9a157e7a03c2218fb6dbe5"}
Apr 17 07:52:42.289920 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.289897 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zh8tn" event={"ID":"e87601ff-22f7-4eb6-bb9e-5d78a6b02e12","Type":"ContainerStarted","Data":"f7cf003cb758b3d1e97d95a406118c5f044186f59f68aa89b5918ffcbc045157"}
Apr 17 07:52:42.291817 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.291792 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"91ec5588a442bfd13c1b20081a36b13160068c3a26b2eb4f652375107c7598d4"}
Apr 17 07:52:42.291902 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.291827 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"664640f376709cbeab78b498809717631dd404d2e1136e7f5bd9df86c9062d57"}
Apr 17 07:52:42.293255 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.293230 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8dpsb" event={"ID":"03b7cc9b-71c3-4b06-9c37-a26058521703","Type":"ContainerStarted","Data":"7066ca734136e86758e6134f44e05844ef852397d14f0ccfa55460afc1e17309"}
Apr 17 07:52:42.294898 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.294839 2560 generic.go:358] "Generic (PLEG): container finished" podID="475827a7-8f6f-4574-b5a7-05d38afa9444" containerID="62e68d72591dc6363474622f8560e7b65c7f70aeb19275c2dc21176592bcb852" exitCode=0
Apr 17 07:52:42.294898 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.294873 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerDied","Data":"62e68d72591dc6363474622f8560e7b65c7f70aeb19275c2dc21176592bcb852"}
Apr 17 07:52:42.301509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.301311 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-224.ec2.internal" podStartSLOduration=20.301296216 podStartE2EDuration="20.301296216s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:27.26516603 +0000 UTC m=+5.752287835" watchObservedRunningTime="2026-04-17 07:52:42.301296216 +0000 UTC m=+20.788418021"
Apr 17 07:52:42.301975 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.301930 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pqj85" podStartSLOduration=3.234968337 podStartE2EDuration="20.301918632s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.634362659 +0000 UTC m=+3.121484443" lastFinishedPulling="2026-04-17 07:52:41.701312942 +0000 UTC m=+20.188434738" observedRunningTime="2026-04-17 07:52:42.300902342 +0000 UTC m=+20.788024144" watchObservedRunningTime="2026-04-17 07:52:42.301918632 +0000 UTC m=+20.789040488"
Apr 17 07:52:42.314665 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.314613 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-65l87" podStartSLOduration=3.356444068 podStartE2EDuration="20.314561024s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.639328678 +0000 UTC m=+3.126450460" lastFinishedPulling="2026-04-17 07:52:41.59744562 +0000 UTC m=+20.084567416" observedRunningTime="2026-04-17 07:52:42.314547596 +0000 UTC m=+20.801669400" watchObservedRunningTime="2026-04-17 07:52:42.314561024 +0000 UTC m=+20.801682828"
Apr 17 07:52:42.327662 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.327595 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8dpsb" podStartSLOduration=3.361567103 podStartE2EDuration="20.327579289s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.631466669 +0000 UTC m=+3.118588451" lastFinishedPulling="2026-04-17 07:52:41.597478841 +0000 UTC m=+20.084600637" observedRunningTime="2026-04-17 07:52:42.327517575 +0000 UTC m=+20.814639379" watchObservedRunningTime="2026-04-17 07:52:42.327579289 +0000 UTC m=+20.814701094"
Apr 17 07:52:42.343712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.343671 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bpgx2" podStartSLOduration=3.370491696 podStartE2EDuration="20.343657554s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.62933876 +0000 UTC m=+3.116460555" lastFinishedPulling="2026-04-17 07:52:41.602504618 +0000 UTC m=+20.089626413" observedRunningTime="2026-04-17 07:52:42.343347139 +0000 UTC m=+20.830468941" watchObservedRunningTime="2026-04-17 07:52:42.343657554 +0000 UTC m=+20.830779355"
Apr 17 07:52:42.385538 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.385497 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zh8tn" podStartSLOduration=3.425096578 podStartE2EDuration="20.385482242s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.637057765 +0000 UTC m=+3.124179552" lastFinishedPulling="2026-04-17 07:52:41.59744342 +0000 UTC m=+20.084565216" observedRunningTime="2026-04-17 07:52:42.384715029 +0000 UTC m=+20.871836831" watchObservedRunningTime="2026-04-17 07:52:42.385482242 +0000 UTC m=+20.872604045"
Apr 17 07:52:42.886522 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:42.886316 2560 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 07:52:43.051097 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.050969 2560 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:52:42.886520872Z","UUID":"3a0a1039-d559-4a2a-89a3-efe10a512306","Handler":null,"Name":"","Endpoint":""}
Apr 17 07:52:43.052679 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.052653 2560 csi_plugin.go:106]
kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 07:52:43.052679 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.052683 2560 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 07:52:43.299668 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.299586 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" event={"ID":"e2c81a49-c679-48e0-9245-cea52134eecc","Type":"ContainerStarted","Data":"a79e76ac5718cc445ddb7d650bca9b134a9e55ad3007604af8b64bc56c299152"}
Apr 17 07:52:43.302899 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.302871 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"1d98e24952e374a0e2558ce239a8f774ec02eb7239920333d6e22a2f08959f5f"}
Apr 17 07:52:43.303043 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.302907 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"9f1dfd5ffc0b726b338d1e9188037df58b452cda24d90712f07bea80c7f5cf74"}
Apr 17 07:52:43.303043 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.302924 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"c12e1b19aca63c6ed9f5e3673ad3f058050f8a84c697d1d1e60bacee40c858e0"}
Apr 17 07:52:43.303043 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.302937 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"0daae97b12f7bdec23639407c2a545f4510f2420f34f7078905d417b2546cb6b"}
Apr 17 07:52:43.304878 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.304856 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vqblm" event={"ID":"e9c30bee-b3b0-40b8-8f95-46b04aca3c77","Type":"ContainerStarted","Data":"573bc68c5a3c46206645ddcd8c266223aedeae5f4f5b69638395ae9d0d48ca2c"}
Apr 17 07:52:43.318907 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:43.318861 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vqblm" podStartSLOduration=4.361577112 podStartE2EDuration="21.318845956s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.640236926 +0000 UTC m=+3.127358706" lastFinishedPulling="2026-04-17 07:52:41.59750577 +0000 UTC m=+20.084627550" observedRunningTime="2026-04-17 07:52:43.318032054 +0000 UTC m=+21.805153858" watchObservedRunningTime="2026-04-17 07:52:43.318845956 +0000 UTC m=+21.805967762"
Apr 17 07:52:44.128063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:44.128030 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:44.128294 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:44.128164 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:44.128294 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:44.128271 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:44.128294 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:44.128158 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:44.308608 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:44.308570 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" event={"ID":"e2c81a49-c679-48e0-9245-cea52134eecc","Type":"ContainerStarted","Data":"6a1eca682283b5f265af15648b36dab8f6db8749ab10dc94d7504b47eaee61bf"}
Apr 17 07:52:44.326268 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:44.326215 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qhl6" podStartSLOduration=3.253633845 podStartE2EDuration="22.326201871s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.63861055 +0000 UTC m=+3.125732332" lastFinishedPulling="2026-04-17 07:52:43.711178577 +0000 UTC m=+22.198300358" observedRunningTime="2026-04-17 07:52:44.326096032 +0000 UTC m=+22.813217826" watchObservedRunningTime="2026-04-17 07:52:44.326201871 +0000 UTC m=+22.813323656"
Apr 17 07:52:44.809209 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:44.809171 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:44.809785 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:44.809758 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:46.129027 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:46.128979 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:46.129714 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:46.129034 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:46.129714 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:46.129148 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:46.129714 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:46.129290 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:47.315941 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:47.315907 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"4cc232fb8016fb08767758509567bb2c0cf66814ae8bcbddb5e9cc1e34b3c80a"}
Apr 17 07:52:47.317374 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:47.317347 2560 generic.go:358] "Generic (PLEG): container finished" podID="475827a7-8f6f-4574-b5a7-05d38afa9444" containerID="d8ca71f1cc8406254849ec8bc47e733e5f9fad71bd37bd50857c3cf8f79c7500" exitCode=0
Apr 17 07:52:47.317490 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:47.317375 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerDied","Data":"d8ca71f1cc8406254849ec8bc47e733e5f9fad71bd37bd50857c3cf8f79c7500"}
Apr 17 07:52:47.943122 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:47.942777 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:47.943122 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:47.942913 2560 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 07:52:47.943799 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:47.943774 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-65l87"
Apr 17 07:52:48.128845 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:48.128802 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:48.128845 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:48.128835 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:48.129109 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:48.128949 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:48.129109 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:48.129069 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:48.320880 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:48.320851 2560 generic.go:358] "Generic (PLEG): container finished" podID="475827a7-8f6f-4574-b5a7-05d38afa9444" containerID="1ecac54c549a13171e5eb89a1f293bf04cb8b822437ec25a0784576886837bcc" exitCode=0
Apr 17 07:52:48.321292 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:48.320916 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerDied","Data":"1ecac54c549a13171e5eb89a1f293bf04cb8b822437ec25a0784576886837bcc"}
Apr 17 07:52:49.326300 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:49.326083 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" event={"ID":"322ec8c6-8646-443d-9065-38a19aa96bd1","Type":"ContainerStarted","Data":"b6178cb725654fa1e31a965c465f309cb5cdaa6fd9d7e6b9a302cc119af674b3"}
Apr 17 07:52:49.326793 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:49.326420 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:49.328378 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:49.328354 2560 generic.go:358] "Generic (PLEG): container finished" podID="475827a7-8f6f-4574-b5a7-05d38afa9444" containerID="8c3d145e488a56c6fa7611169818c1aef07d59ce2e27c87dbd83be39b93cd544" exitCode=0
Apr 17 07:52:49.328490 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:49.328393 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerDied","Data":"8c3d145e488a56c6fa7611169818c1aef07d59ce2e27c87dbd83be39b93cd544"}
Apr 17 07:52:49.341175 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:49.341154 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:49.350617 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:49.350579 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h" podStartSLOduration=9.85803394 podStartE2EDuration="27.35056705s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.636828892 +0000 UTC m=+3.123950678" lastFinishedPulling="2026-04-17 07:52:42.129361999 +0000 UTC m=+20.616483788" observedRunningTime="2026-04-17 07:52:49.350091223 +0000 UTC m=+27.837213025" watchObservedRunningTime="2026-04-17 07:52:49.35056705 +0000 UTC m=+27.837688852"
Apr 17 07:52:50.131696 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.131654 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:50.131866 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.131654 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:50.131866 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:50.131787 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:50.131866 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:50.131829 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:50.331383 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.331127 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:50.331792 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.331403 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:50.348311 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.348281 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:52:50.487759 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.487685 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vchg7"]
Apr 17 07:52:50.487891 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.487832 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:50.487959 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:50.487933 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:50.490481 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.490342 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k4vcb"]
Apr 17 07:52:50.490481 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:50.490472 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:50.490670 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:50.490574 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:52.129263 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:52.129226 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:52.129912 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:52.129345 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:52.129912 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:52.129374 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:52.129912 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:52.129484 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:54.131182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.131143 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:52:54.131698 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.131152 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:52:54.131698 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:54.131275 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vchg7" podUID="82c7c47d-33d8-4e71-8695-11aab98b699d"
Apr 17 07:52:54.131698 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:54.131347 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028"
Apr 17 07:52:54.797698 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.797623 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-224.ec2.internal" event="NodeReady"
Apr 17 07:52:54.797864 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.797777 2560 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 07:52:54.837860 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.837822 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4k2tj"]
Apr 17 07:52:54.875051 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.875012 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mxsbk"]
Apr 17 07:52:54.875212 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.875133 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:54.877627 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.877597 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 07:52:54.877627 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.877597 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 07:52:54.877814 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.877785 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d4l2f\""
Apr 17 07:52:54.897969 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.897945 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4k2tj"]
Apr 17 07:52:54.898077 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.897978 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mxsbk"]
Apr 17 07:52:54.898077 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.897976 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:54.900147 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.900126 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 07:52:54.900256 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.900192 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 07:52:54.900256 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.900194 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fmxgb\""
Apr 17 07:52:54.900256 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:54.900192 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 07:52:55.036402 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.036372 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.036402 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.036411 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-config-volume\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.036609 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.036426 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:55.036609 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.036453 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6n55\" (UniqueName: \"kubernetes.io/projected/f273b17c-4ccf-45b2-93e0-868dd9134101-kube-api-access-f6n55\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:55.036609 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.036515 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8f9\" (UniqueName: \"kubernetes.io/projected/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-kube-api-access-rh8f9\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.036609 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.036582 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-tmp-dir\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.136898 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.136857 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-tmp-dir\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.136917 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.136955 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-config-volume\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.136981 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.137028 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6n55\" (UniqueName: \"kubernetes.io/projected/f273b17c-4ccf-45b2-93e0-868dd9134101-kube-api-access-f6n55\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.137051 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8f9\" (UniqueName: \"kubernetes.io/projected/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-kube-api-access-rh8f9\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.137120 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.137194 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:55.637174125 +0000 UTC m=+34.124295925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.137127 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.137287 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-tmp-dir\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.137566 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.137309 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:55.637293664 +0000 UTC m=+34.124415445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found
Apr 17 07:52:55.137908 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.137618 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-config-volume\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.146656 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.146634 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8f9\" (UniqueName: \"kubernetes.io/projected/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-kube-api-access-rh8f9\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.146928 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.146906 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6n55\" (UniqueName: \"kubernetes.io/projected/f273b17c-4ccf-45b2-93e0-868dd9134101-kube-api-access-f6n55\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:55.640065 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.640025 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:52:55.640148 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.640134 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:52:55.640201 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.640168 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:55.640241 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.640233 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:56.640217727 +0000 UTC m=+35.127339508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found
Apr 17 07:52:55.640287 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.640244 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:55.640319 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.640304 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:56.640290358 +0000 UTC m=+35.127412142 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found Apr 17 07:52:55.740767 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.740682 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:55.740909 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.740804 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:55.740909 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.740855 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:27.740843516 +0000 UTC m=+66.227965296 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:55.942173 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:55.942134 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:55.942344 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.942322 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:55.942399 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.942354 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:55.942399 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.942370 2560 projected.go:194] Error preparing data for projected volume kube-api-access-7skwl for pod openshift-network-diagnostics/network-check-target-vchg7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:55.942462 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:55.942434 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl podName:82c7c47d-33d8-4e71-8695-11aab98b699d nodeName:}" failed. 
No retries permitted until 2026-04-17 07:53:27.942416136 +0000 UTC m=+66.429537923 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7skwl" (UniqueName: "kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl") pod "network-check-target-vchg7" (UID: "82c7c47d-33d8-4e71-8695-11aab98b699d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:56.128576 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.128538 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7" Apr 17 07:52:56.128749 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.128540 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:52:56.131325 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.131304 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n45c2\"" Apr 17 07:52:56.131325 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.131319 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 07:52:56.131509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.131341 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 07:52:56.131509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.131345 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 07:52:56.131509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.131440 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jkb85\"" Apr 17 07:52:56.344300 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.344260 2560 generic.go:358] "Generic (PLEG): container finished" podID="475827a7-8f6f-4574-b5a7-05d38afa9444" containerID="b1ba937b92819be72503629118e6d88b33c9a92a1cd3b495ecc1fd03950642ac" exitCode=0 Apr 17 07:52:56.344659 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.344309 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerDied","Data":"b1ba937b92819be72503629118e6d88b33c9a92a1cd3b495ecc1fd03950642ac"} Apr 17 07:52:56.647305 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.647030 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj" Apr 17 07:52:56.647305 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:56.647287 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:52:56.647518 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:56.647169 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:56.647518 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:56.647402 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:58.647381042 +0000 UTC m=+37.134502828 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found Apr 17 07:52:56.647518 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:56.647403 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:56.647518 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:56.647454 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:58.647437763 +0000 UTC m=+37.134559559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found Apr 17 07:52:57.349340 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:57.349313 2560 generic.go:358] "Generic (PLEG): container finished" podID="475827a7-8f6f-4574-b5a7-05d38afa9444" containerID="4af7783a0cd23e54fa0f2a0a0cbe584e9af5aef4da662ede99ac3fc9338a4542" exitCode=0 Apr 17 07:52:57.349732 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:57.349347 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerDied","Data":"4af7783a0cd23e54fa0f2a0a0cbe584e9af5aef4da662ede99ac3fc9338a4542"} Apr 17 07:52:58.353514 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:58.353485 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-88cn6" event={"ID":"475827a7-8f6f-4574-b5a7-05d38afa9444","Type":"ContainerStarted","Data":"a6972d7337b5e5c0320bc49cd6401f600c51e626b6b1e2c8deed8a18055bce17"} Apr 17 07:52:58.376068 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:58.376020 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-88cn6" podStartSLOduration=5.666282386 podStartE2EDuration="36.376006035s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:24.627485235 +0000 UTC m=+3.114607017" lastFinishedPulling="2026-04-17 07:52:55.33720888 +0000 UTC m=+33.824330666" observedRunningTime="2026-04-17 07:52:58.374487382 +0000 UTC m=+36.861609186" watchObservedRunningTime="2026-04-17 07:52:58.376006035 +0000 UTC m=+36.863127835" Apr 17 07:52:58.663712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:58.663612 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:52:58.663712 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:58.663711 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj" Apr 17 07:52:58.663932 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:58.663766 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:58.663932 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:58.663809 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 17 07:52:58.663932 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:58.663840 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:02.663824631 +0000 UTC m=+41.150946424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found Apr 17 07:52:58.663932 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:52:58.663856 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:02.663849389 +0000 UTC m=+41.150971169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found Apr 17 07:52:59.082304 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.082240 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn"] Apr 17 07:52:59.097469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.097443 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs"] Apr 17 07:52:59.097641 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.097557 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.099779 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.099755 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 07:52:59.099912 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.099829 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 07:52:59.099912 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.099755 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mm6vl\"" Apr 17 07:52:59.100745 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.100723 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 07:52:59.100844 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.100728 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 07:52:59.109716 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.109696 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn"] Apr 17 07:52:59.109795 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.109722 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs"] Apr 17 07:52:59.109842 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.109804 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.112058 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.112043 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 07:52:59.267366 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.267332 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcd82\" (UniqueName: \"kubernetes.io/projected/e1e3f9aa-48cf-4782-9812-a4dd634891fe-kube-api-access-mcd82\") pod \"managed-serviceaccount-addon-agent-7545f5564f-jzkjn\" (UID: \"e1e3f9aa-48cf-4782-9812-a4dd634891fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.267558 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.267390 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f95b90db-0860-4924-8687-bea47871abc9-tmp\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.267558 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.267491 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f95b90db-0860-4924-8687-bea47871abc9-klusterlet-config\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.267558 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.267519 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lqgwn\" (UniqueName: \"kubernetes.io/projected/f95b90db-0860-4924-8687-bea47871abc9-kube-api-access-lqgwn\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.267678 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.267557 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e1e3f9aa-48cf-4782-9812-a4dd634891fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7545f5564f-jzkjn\" (UID: \"e1e3f9aa-48cf-4782-9812-a4dd634891fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.368055 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.367956 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqgwn\" (UniqueName: \"kubernetes.io/projected/f95b90db-0860-4924-8687-bea47871abc9-kube-api-access-lqgwn\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.368055 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.368026 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e1e3f9aa-48cf-4782-9812-a4dd634891fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7545f5564f-jzkjn\" (UID: \"e1e3f9aa-48cf-4782-9812-a4dd634891fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.368055 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.368054 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcd82\" (UniqueName: 
\"kubernetes.io/projected/e1e3f9aa-48cf-4782-9812-a4dd634891fe-kube-api-access-mcd82\") pod \"managed-serviceaccount-addon-agent-7545f5564f-jzkjn\" (UID: \"e1e3f9aa-48cf-4782-9812-a4dd634891fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.368620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.368086 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f95b90db-0860-4924-8687-bea47871abc9-tmp\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.368620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.368171 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f95b90db-0860-4924-8687-bea47871abc9-klusterlet-config\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.368690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.368653 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f95b90db-0860-4924-8687-bea47871abc9-tmp\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.371314 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.371292 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e1e3f9aa-48cf-4782-9812-a4dd634891fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7545f5564f-jzkjn\" (UID: \"e1e3f9aa-48cf-4782-9812-a4dd634891fe\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.371416 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.371295 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f95b90db-0860-4924-8687-bea47871abc9-klusterlet-config\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.376790 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.376764 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcd82\" (UniqueName: \"kubernetes.io/projected/e1e3f9aa-48cf-4782-9812-a4dd634891fe-kube-api-access-mcd82\") pod \"managed-serviceaccount-addon-agent-7545f5564f-jzkjn\" (UID: \"e1e3f9aa-48cf-4782-9812-a4dd634891fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.376889 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.376818 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqgwn\" (UniqueName: \"kubernetes.io/projected/f95b90db-0860-4924-8687-bea47871abc9-kube-api-access-lqgwn\") pod \"klusterlet-addon-workmgr-57f7ff4954-q4sjs\" (UID: \"f95b90db-0860-4924-8687-bea47871abc9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.415123 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.415089 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" Apr 17 07:52:59.421880 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.421854 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:52:59.592283 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.592065 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs"] Apr 17 07:52:59.594922 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:52:59.594898 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn"] Apr 17 07:52:59.601824 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:59.601794 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95b90db_0860_4924_8687_bea47871abc9.slice/crio-aae0ed75de8553bdb53c07a17f5df3e5c13634e67dd9d3aeecaf607ea189c4fa WatchSource:0}: Error finding container aae0ed75de8553bdb53c07a17f5df3e5c13634e67dd9d3aeecaf607ea189c4fa: Status 404 returned error can't find the container with id aae0ed75de8553bdb53c07a17f5df3e5c13634e67dd9d3aeecaf607ea189c4fa Apr 17 07:52:59.602144 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:52:59.602122 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e3f9aa_48cf_4782_9812_a4dd634891fe.slice/crio-acd0656bfb599b5a889ef4408853f5896d597e9290e2de67e5845f06d20ddf81 WatchSource:0}: Error finding container acd0656bfb599b5a889ef4408853f5896d597e9290e2de67e5845f06d20ddf81: Status 404 returned error can't find the container with id acd0656bfb599b5a889ef4408853f5896d597e9290e2de67e5845f06d20ddf81 Apr 17 07:53:00.358299 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:00.358266 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" 
event={"ID":"f95b90db-0860-4924-8687-bea47871abc9","Type":"ContainerStarted","Data":"aae0ed75de8553bdb53c07a17f5df3e5c13634e67dd9d3aeecaf607ea189c4fa"} Apr 17 07:53:00.359328 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:00.359296 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" event={"ID":"e1e3f9aa-48cf-4782-9812-a4dd634891fe","Type":"ContainerStarted","Data":"acd0656bfb599b5a889ef4408853f5896d597e9290e2de67e5845f06d20ddf81"} Apr 17 07:53:02.693558 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:02.693525 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj" Apr 17 07:53:02.694045 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:02.693572 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:53:02.694045 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:02.693683 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:53:02.694045 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:02.693730 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:53:02.694045 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:02.693747 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. 
No retries permitted until 2026-04-17 07:53:10.693733408 +0000 UTC m=+49.180855193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:02.694045 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:02.693802 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:10.693789216 +0000 UTC m=+49.180910998 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found
Apr 17 07:53:05.371095 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:05.371060 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" event={"ID":"f95b90db-0860-4924-8687-bea47871abc9","Type":"ContainerStarted","Data":"de89645f5bef3572eb2f905becc3643955ea07d5b0d7234f45b5ad9d758b1511"}
Apr 17 07:53:05.371556 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:05.371260 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs"
Apr 17 07:53:05.372434 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:05.372410 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" event={"ID":"e1e3f9aa-48cf-4782-9812-a4dd634891fe","Type":"ContainerStarted","Data":"9a5d0fb8ec038865dcf5497cac0858c7459044733273052587eafb8f8694f683"}
Apr 17 07:53:05.373202 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:05.373185 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs"
Apr 17 07:53:05.386417 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:05.386379 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" podStartSLOduration=1.440293894 podStartE2EDuration="6.386366503s" podCreationTimestamp="2026-04-17 07:52:59 +0000 UTC" firstStartedPulling="2026-04-17 07:52:59.60381755 +0000 UTC m=+38.090939335" lastFinishedPulling="2026-04-17 07:53:04.549890164 +0000 UTC m=+43.037011944" observedRunningTime="2026-04-17 07:53:05.385482239 +0000 UTC m=+43.872604044" watchObservedRunningTime="2026-04-17 07:53:05.386366503 +0000 UTC m=+43.873488305"
Apr 17 07:53:05.413256 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:05.413212 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" podStartSLOduration=1.481897851 podStartE2EDuration="6.413199306s" podCreationTimestamp="2026-04-17 07:52:59 +0000 UTC" firstStartedPulling="2026-04-17 07:52:59.604031085 +0000 UTC m=+38.091152884" lastFinishedPulling="2026-04-17 07:53:04.535332545 +0000 UTC m=+43.022454339" observedRunningTime="2026-04-17 07:53:05.41268233 +0000 UTC m=+43.899804142" watchObservedRunningTime="2026-04-17 07:53:05.413199306 +0000 UTC m=+43.900321109"
Apr 17 07:53:10.749526 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:10.749488 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:53:10.749943 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:10.749576 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:53:10.749943 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:10.749638 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:10.749943 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:10.749680 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:10.749943 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:10.749726 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.749707894 +0000 UTC m=+65.236829680 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found
Apr 17 07:53:10.749943 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:10.749743 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.749736869 +0000 UTC m=+65.236858649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:22.345364 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:22.345336 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zfq9h"
Apr 17 07:53:26.849273 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:26.849233 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:53:26.849273 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:26.849282 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:53:26.849735 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:26.849384 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:26.849735 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:26.849408 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:26.849735 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:26.849461 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:58.849445601 +0000 UTC m=+97.336567405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found
Apr 17 07:53:26.849735 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:26.849474 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:58.849467759 +0000 UTC m=+97.336589539 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:27.755441 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:27.755407 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:53:27.758134 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:27.758112 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:53:27.766053 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:27.766033 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:53:27.766151 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:27.766101 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:31.766080961 +0000 UTC m=+130.253202745 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : secret "metrics-daemon-secret" not found
Apr 17 07:53:27.957531 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:27.957480 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:53:27.960400 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:27.960376 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:53:27.970857 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:27.970834 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:53:27.981940 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:27.981916 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skwl\" (UniqueName: \"kubernetes.io/projected/82c7c47d-33d8-4e71-8695-11aab98b699d-kube-api-access-7skwl\") pod \"network-check-target-vchg7\" (UID: \"82c7c47d-33d8-4e71-8695-11aab98b699d\") " pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:53:28.243885 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:28.241453 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n45c2\""
Apr 17 07:53:28.248914 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:28.248885 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:53:28.371193 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:28.371141 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vchg7"]
Apr 17 07:53:28.378469 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:53:28.378431 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c7c47d_33d8_4e71_8695_11aab98b699d.slice/crio-644396990b4a34eea1d249157e9384cbbe68eb26c01e06a5ba4f6f2508ea2fe6 WatchSource:0}: Error finding container 644396990b4a34eea1d249157e9384cbbe68eb26c01e06a5ba4f6f2508ea2fe6: Status 404 returned error can't find the container with id 644396990b4a34eea1d249157e9384cbbe68eb26c01e06a5ba4f6f2508ea2fe6
Apr 17 07:53:28.417091 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:28.417063 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vchg7" event={"ID":"82c7c47d-33d8-4e71-8695-11aab98b699d","Type":"ContainerStarted","Data":"644396990b4a34eea1d249157e9384cbbe68eb26c01e06a5ba4f6f2508ea2fe6"}
Apr 17 07:53:31.424634 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:31.424598 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vchg7" event={"ID":"82c7c47d-33d8-4e71-8695-11aab98b699d","Type":"ContainerStarted","Data":"7b33183ac3beb1bedcb72107d8090be4b94a942df683cbc2c8e8d83dda55fbff"}
Apr 17 07:53:31.425037 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:31.424758 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:53:31.440932 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:31.440886 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vchg7" podStartSLOduration=66.82023531 podStartE2EDuration="1m9.440870947s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:53:28.381067112 +0000 UTC m=+66.868188893" lastFinishedPulling="2026-04-17 07:53:31.001702735 +0000 UTC m=+69.488824530" observedRunningTime="2026-04-17 07:53:31.440064361 +0000 UTC m=+69.927186189" watchObservedRunningTime="2026-04-17 07:53:31.440870947 +0000 UTC m=+69.927992750"
Apr 17 07:53:58.866114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:58.865957 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj"
Apr 17 07:53:58.866114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:53:58.866042 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk"
Apr 17 07:53:58.866114 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:58.866105 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:58.866114 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:58.866123 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:58.866828 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:58.866171 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls podName:00ef5608-cd73-4a1a-ac02-5e5da7727a5f nodeName:}" failed. No retries permitted until 2026-04-17 07:55:02.866153754 +0000 UTC m=+161.353275534 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls") pod "dns-default-4k2tj" (UID: "00ef5608-cd73-4a1a-ac02-5e5da7727a5f") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:58.866828 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:53:58.866185 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert podName:f273b17c-4ccf-45b2-93e0-868dd9134101 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:02.866178863 +0000 UTC m=+161.353300644 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert") pod "ingress-canary-mxsbk" (UID: "f273b17c-4ccf-45b2-93e0-868dd9134101") : secret "canary-serving-cert" not found
Apr 17 07:54:02.429250 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:02.429198 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vchg7"
Apr 17 07:54:31.785289 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:31.785242 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb"
Apr 17 07:54:31.785785 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:31.785357 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:54:31.785785 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:31.785419 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs podName:f6ca1d48-95c2-414b-af4e-838843029028 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:33.785404762 +0000 UTC m=+252.272526544 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs") pod "network-metrics-daemon-k4vcb" (UID: "f6ca1d48-95c2-414b-af4e-838843029028") : secret "metrics-daemon-secret" not found
Apr 17 07:54:43.770097 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.770063 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"]
Apr 17 07:54:43.772124 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.772106 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"
Apr 17 07:54:43.777001 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.776963 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 07:54:43.777092 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.776980 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:43.777891 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.777698 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-nbt7c\""
Apr 17 07:54:43.778117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.778098 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:43.796163 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.796143 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"]
Apr 17 07:54:43.867504 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.867484 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"
Apr 17 07:54:43.867588 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.867513 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpc55\" (UniqueName: \"kubernetes.io/projected/3b35dde2-df58-4352-9c06-578074e85124-kube-api-access-lpc55\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"
Apr 17 07:54:43.881014 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.880978 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"]
Apr 17 07:54:43.882806 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.882791 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5845f77674-gg4r5"]
Apr 17 07:54:43.882940 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.882925 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"
Apr 17 07:54:43.884435 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.884419 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.889680 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.889662 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 07:54:43.889874 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.889861 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 07:54:43.890002 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.889969 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-cskpl\""
Apr 17 07:54:43.890091 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.890076 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b7smv\""
Apr 17 07:54:43.890280 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.890266 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 07:54:43.895512 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.895492 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 07:54:43.895749 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.895734 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"]
Apr 17 07:54:43.901949 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.901928 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5845f77674-gg4r5"]
Apr 17 07:54:43.968481 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968454 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-image-registry-private-configuration\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968486 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968598 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968518 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc55\" (UniqueName: \"kubernetes.io/projected/3b35dde2-df58-4352-9c06-578074e85124-kube-api-access-lpc55\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"
Apr 17 07:54:43.968709 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968594 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfacbb4c-e321-492b-9f6c-f223c66aba6e-ca-trust-extracted\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968709 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968619 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-trusted-ca\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968709 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968639 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-bound-sa-token\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968709 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968665 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89cwl\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-kube-api-access-89cwl\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968861 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968733 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-installation-pull-secrets\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968861 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968767 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skjbw\" (UniqueName: \"kubernetes.io/projected/a03d58cf-86b6-4ec5-be8c-e346b788c3d6-kube-api-access-skjbw\") pod \"network-check-source-8894fc9bd-vlg7t\" (UID: \"a03d58cf-86b6-4ec5-be8c-e346b788c3d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"
Apr 17 07:54:43.968861 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968788 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-certificates\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:43.968861 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.968843 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"
Apr 17 07:54:43.968981 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:43.968936 2560 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 07:54:43.969041 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:43.968981 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls podName:3b35dde2-df58-4352-9c06-578074e85124 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:44.468967539 +0000 UTC m=+142.956089323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jczck" (UID: "3b35dde2-df58-4352-9c06-578074e85124") : secret "samples-operator-tls" not found
Apr 17 07:54:43.979833 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:43.979807 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpc55\" (UniqueName: \"kubernetes.io/projected/3b35dde2-df58-4352-9c06-578074e85124-kube-api-access-lpc55\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"
Apr 17 07:54:44.069842 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.069820 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfacbb4c-e321-492b-9f6c-f223c66aba6e-ca-trust-extracted\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.069940 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.069847 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-trusted-ca\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.069940 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.069871 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-bound-sa-token\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.069940 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.069892 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89cwl\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-kube-api-access-89cwl\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.069940 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.069934 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-installation-pull-secrets\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.070183 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.069974 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skjbw\" (UniqueName: \"kubernetes.io/projected/a03d58cf-86b6-4ec5-be8c-e346b788c3d6-kube-api-access-skjbw\") pod \"network-check-source-8894fc9bd-vlg7t\" (UID: \"a03d58cf-86b6-4ec5-be8c-e346b788c3d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"
Apr 17 07:54:44.070183 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.070018 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-certificates\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.070183 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.070080 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-image-registry-private-configuration\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.070183 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.070108 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.070377 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.070202 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:54:44.070377 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.070217 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5845f77674-gg4r5: secret "image-registry-tls" not found
Apr 17 07:54:44.070377 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.070282 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls podName:cfacbb4c-e321-492b-9f6c-f223c66aba6e nodeName:}" failed. No retries permitted until 2026-04-17 07:54:44.570264603 +0000 UTC m=+143.057386392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls") pod "image-registry-5845f77674-gg4r5" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e") : secret "image-registry-tls" not found
Apr 17 07:54:44.070670 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.070632 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-certificates\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.070769 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.070702 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfacbb4c-e321-492b-9f6c-f223c66aba6e-ca-trust-extracted\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.070860 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.070834 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-trusted-ca\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.072313 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.072291 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-installation-pull-secrets\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.073468 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.073447 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-image-registry-private-configuration\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.082209 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.082189 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89cwl\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-kube-api-access-89cwl\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.082346 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.082328 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-bound-sa-token\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:54:44.084658 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.084641 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skjbw\" (UniqueName: \"kubernetes.io/projected/a03d58cf-86b6-4ec5-be8c-e346b788c3d6-kube-api-access-skjbw\") pod \"network-check-source-8894fc9bd-vlg7t\" (UID: \"a03d58cf-86b6-4ec5-be8c-e346b788c3d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"
Apr 17 07:54:44.191480 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.191450 2560 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t" Apr 17 07:54:44.299833 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.299806 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t"] Apr 17 07:54:44.303530 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:54:44.303502 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03d58cf_86b6_4ec5_be8c_e346b788c3d6.slice/crio-234f166fb01bab0ffc99f400af73d06d3f304edcb8e477d87efae159d5250a42 WatchSource:0}: Error finding container 234f166fb01bab0ffc99f400af73d06d3f304edcb8e477d87efae159d5250a42: Status 404 returned error can't find the container with id 234f166fb01bab0ffc99f400af73d06d3f304edcb8e477d87efae159d5250a42 Apr 17 07:54:44.472421 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.472383 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:44.472580 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.472559 2560 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:44.472645 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.472635 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls podName:3b35dde2-df58-4352-9c06-578074e85124 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:45.472614468 +0000 UTC m=+143.959736257 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jczck" (UID: "3b35dde2-df58-4352-9c06-578074e85124") : secret "samples-operator-tls" not found Apr 17 07:54:44.559001 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.558962 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t" event={"ID":"a03d58cf-86b6-4ec5-be8c-e346b788c3d6","Type":"ContainerStarted","Data":"783cfc4c74d1fce85da9dd2b0cfb90d66389bb6f283b73096add3369219e2aa1"} Apr 17 07:54:44.559093 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.559008 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t" event={"ID":"a03d58cf-86b6-4ec5-be8c-e346b788c3d6","Type":"ContainerStarted","Data":"234f166fb01bab0ffc99f400af73d06d3f304edcb8e477d87efae159d5250a42"} Apr 17 07:54:44.573675 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.573656 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:44.573846 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.573777 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:44.573846 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.573791 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5845f77674-gg4r5: secret "image-registry-tls" not found Apr 17 07:54:44.574001 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:44.573849 
2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls podName:cfacbb4c-e321-492b-9f6c-f223c66aba6e nodeName:}" failed. No retries permitted until 2026-04-17 07:54:45.573835601 +0000 UTC m=+144.060957382 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls") pod "image-registry-5845f77674-gg4r5" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e") : secret "image-registry-tls" not found Apr 17 07:54:44.575005 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:44.574957 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlg7t" podStartSLOduration=1.57494726 podStartE2EDuration="1.57494726s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:44.574091487 +0000 UTC m=+143.061213291" watchObservedRunningTime="2026-04-17 07:54:44.57494726 +0000 UTC m=+143.062069063" Apr 17 07:54:45.479836 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:45.479796 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:45.480290 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:45.479931 2560 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:45.480290 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:45.480025 2560 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls podName:3b35dde2-df58-4352-9c06-578074e85124 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:47.479978906 +0000 UTC m=+145.967100687 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jczck" (UID: "3b35dde2-df58-4352-9c06-578074e85124") : secret "samples-operator-tls" not found Apr 17 07:54:45.580704 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:45.580669 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:45.580855 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:45.580767 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:45.580855 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:45.580778 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5845f77674-gg4r5: secret "image-registry-tls" not found Apr 17 07:54:45.580855 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:45.580821 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls podName:cfacbb4c-e321-492b-9f6c-f223c66aba6e nodeName:}" failed. No retries permitted until 2026-04-17 07:54:47.580808139 +0000 UTC m=+146.067929920 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls") pod "image-registry-5845f77674-gg4r5" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e") : secret "image-registry-tls" not found Apr 17 07:54:47.495050 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:47.495009 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:47.495412 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:47.495137 2560 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:47.495412 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:47.495218 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls podName:3b35dde2-df58-4352-9c06-578074e85124 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:51.495201497 +0000 UTC m=+149.982323277 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jczck" (UID: "3b35dde2-df58-4352-9c06-578074e85124") : secret "samples-operator-tls" not found Apr 17 07:54:47.595746 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:47.595716 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:47.595906 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:47.595872 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:47.595906 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:47.595889 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5845f77674-gg4r5: secret "image-registry-tls" not found Apr 17 07:54:47.596052 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:47.595949 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls podName:cfacbb4c-e321-492b-9f6c-f223c66aba6e nodeName:}" failed. No retries permitted until 2026-04-17 07:54:51.595929296 +0000 UTC m=+150.083051077 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls") pod "image-registry-5845f77674-gg4r5" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e") : secret "image-registry-tls" not found Apr 17 07:54:50.908331 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:50.908304 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zh8tn_e87601ff-22f7-4eb6-bb9e-5d78a6b02e12/dns-node-resolver/0.log" Apr 17 07:54:51.523206 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:51.523153 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:51.523383 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:51.523303 2560 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:51.523383 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:51.523370 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls podName:3b35dde2-df58-4352-9c06-578074e85124 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:59.523355398 +0000 UTC m=+158.010477179 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jczck" (UID: "3b35dde2-df58-4352-9c06-578074e85124") : secret "samples-operator-tls" not found Apr 17 07:54:51.624232 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:51.624202 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:51.624376 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:51.624314 2560 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:54:51.624376 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:51.624325 2560 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5845f77674-gg4r5: secret "image-registry-tls" not found Apr 17 07:54:51.624376 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:51.624375 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls podName:cfacbb4c-e321-492b-9f6c-f223c66aba6e nodeName:}" failed. No retries permitted until 2026-04-17 07:54:59.624363442 +0000 UTC m=+158.111485223 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls") pod "image-registry-5845f77674-gg4r5" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e") : secret "image-registry-tls" not found Apr 17 07:54:51.708542 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:51.708520 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8dpsb_03b7cc9b-71c3-4b06-9c37-a26058521703/node-ca/0.log" Apr 17 07:54:57.885818 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:57.885776 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4k2tj" podUID="00ef5608-cd73-4a1a-ac02-5e5da7727a5f" Apr 17 07:54:57.906072 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:57.906052 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mxsbk" podUID="f273b17c-4ccf-45b2-93e0-868dd9134101" Apr 17 07:54:58.586838 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:58.586801 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:54:58.586838 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:58.586815 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4k2tj" Apr 17 07:54:59.143831 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:54:59.143794 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-k4vcb" podUID="f6ca1d48-95c2-414b-af4e-838843029028" Apr 17 07:54:59.581276 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.581248 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:59.583532 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.583501 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b35dde2-df58-4352-9c06-578074e85124-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jczck\" (UID: \"3b35dde2-df58-4352-9c06-578074e85124\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:59.681001 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.680960 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" Apr 17 07:54:59.681783 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.681575 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:59.683784 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.683765 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"image-registry-5845f77674-gg4r5\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") " pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:59.793120 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.793093 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck"] Apr 17 07:54:59.797287 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.797266 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:54:59.915361 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:54:59.915331 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5845f77674-gg4r5"] Apr 17 07:54:59.919043 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:54:59.919021 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfacbb4c_e321_492b_9f6c_f223c66aba6e.slice/crio-538ddd79218ca41874f96725a04eccaad5f4aa758085b97969611c76e674288d WatchSource:0}: Error finding container 538ddd79218ca41874f96725a04eccaad5f4aa758085b97969611c76e674288d: Status 404 returned error can't find the container with id 538ddd79218ca41874f96725a04eccaad5f4aa758085b97969611c76e674288d Apr 17 07:55:00.592100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:00.592058 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" event={"ID":"3b35dde2-df58-4352-9c06-578074e85124","Type":"ContainerStarted","Data":"2cae2e1b76ff1f22cca6976343d0b2c77c6954757c9bff0578e8a787aa760b28"} Apr 17 07:55:00.593524 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:00.593493 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" event={"ID":"cfacbb4c-e321-492b-9f6c-f223c66aba6e","Type":"ContainerStarted","Data":"897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc"} Apr 17 07:55:00.593650 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:00.593528 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" event={"ID":"cfacbb4c-e321-492b-9f6c-f223c66aba6e","Type":"ContainerStarted","Data":"538ddd79218ca41874f96725a04eccaad5f4aa758085b97969611c76e674288d"} Apr 17 07:55:00.593650 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:55:00.593640 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:55:00.612116 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:00.612069 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" podStartSLOduration=17.612056892 podStartE2EDuration="17.612056892s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:55:00.61086116 +0000 UTC m=+159.097982976" watchObservedRunningTime="2026-04-17 07:55:00.612056892 +0000 UTC m=+159.099178694" Apr 17 07:55:01.598127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:01.598079 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" event={"ID":"3b35dde2-df58-4352-9c06-578074e85124","Type":"ContainerStarted","Data":"3dc0e0bca215dcfbf95b2827445c1b26c3f74f337261932b5ce3a03f7092cbcf"} Apr 17 07:55:01.598127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:01.598132 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" event={"ID":"3b35dde2-df58-4352-9c06-578074e85124","Type":"ContainerStarted","Data":"84a9155d74e7abc1ee53a61bf874fd0f6ef8d99b5ca2ae0d3cdb29e69237fa33"} Apr 17 07:55:01.613909 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:01.613828 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jczck" podStartSLOduration=17.088453499 podStartE2EDuration="18.61381096s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:59.853499606 +0000 UTC m=+158.340621401" lastFinishedPulling="2026-04-17 07:55:01.378857077 
+0000 UTC m=+159.865978862" observedRunningTime="2026-04-17 07:55:01.613652964 +0000 UTC m=+160.100774779" watchObservedRunningTime="2026-04-17 07:55:01.61381096 +0000 UTC m=+160.100932779" Apr 17 07:55:02.907950 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:02.907907 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj" Apr 17 07:55:02.908374 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:02.907971 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:55:02.910295 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:02.910269 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ef5608-cd73-4a1a-ac02-5e5da7727a5f-metrics-tls\") pod \"dns-default-4k2tj\" (UID: \"00ef5608-cd73-4a1a-ac02-5e5da7727a5f\") " pod="openshift-dns/dns-default-4k2tj" Apr 17 07:55:02.910406 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:02.910387 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f273b17c-4ccf-45b2-93e0-868dd9134101-cert\") pod \"ingress-canary-mxsbk\" (UID: \"f273b17c-4ccf-45b2-93e0-868dd9134101\") " pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:55:03.090071 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.090043 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d4l2f\"" Apr 17 07:55:03.090941 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.090925 2560 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fmxgb\"" Apr 17 07:55:03.097883 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.097853 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mxsbk" Apr 17 07:55:03.097952 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.097934 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4k2tj" Apr 17 07:55:03.224197 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.224020 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4k2tj"] Apr 17 07:55:03.226572 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:03.226540 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ef5608_cd73_4a1a_ac02_5e5da7727a5f.slice/crio-2150edd01d6778438cb900b3d8c329a64a9515ff2f6f578ba576aeb4b51f99a7 WatchSource:0}: Error finding container 2150edd01d6778438cb900b3d8c329a64a9515ff2f6f578ba576aeb4b51f99a7: Status 404 returned error can't find the container with id 2150edd01d6778438cb900b3d8c329a64a9515ff2f6f578ba576aeb4b51f99a7 Apr 17 07:55:03.238590 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.238563 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mxsbk"] Apr 17 07:55:03.241393 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:03.241369 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf273b17c_4ccf_45b2_93e0_868dd9134101.slice/crio-61ce305eca67e00f4ecb0265950d47b803ca738886d680bc6f2307594b753738 WatchSource:0}: Error finding container 61ce305eca67e00f4ecb0265950d47b803ca738886d680bc6f2307594b753738: Status 404 returned error can't find the container with id 
61ce305eca67e00f4ecb0265950d47b803ca738886d680bc6f2307594b753738 Apr 17 07:55:03.605196 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.605148 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mxsbk" event={"ID":"f273b17c-4ccf-45b2-93e0-868dd9134101","Type":"ContainerStarted","Data":"61ce305eca67e00f4ecb0265950d47b803ca738886d680bc6f2307594b753738"} Apr 17 07:55:03.606284 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:03.606242 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4k2tj" event={"ID":"00ef5608-cd73-4a1a-ac02-5e5da7727a5f","Type":"ContainerStarted","Data":"2150edd01d6778438cb900b3d8c329a64a9515ff2f6f578ba576aeb4b51f99a7"} Apr 17 07:55:05.372217 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.372157 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" podUID="f95b90db-0860-4924-8687-bea47871abc9" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.8:8000/readyz\": dial tcp 10.134.0.8:8000: connect: connection refused" Apr 17 07:55:05.612819 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.612784 2560 generic.go:358] "Generic (PLEG): container finished" podID="f95b90db-0860-4924-8687-bea47871abc9" containerID="de89645f5bef3572eb2f905becc3643955ea07d5b0d7234f45b5ad9d758b1511" exitCode=1 Apr 17 07:55:05.613016 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.612857 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" event={"ID":"f95b90db-0860-4924-8687-bea47871abc9","Type":"ContainerDied","Data":"de89645f5bef3572eb2f905becc3643955ea07d5b0d7234f45b5ad9d758b1511"} Apr 17 07:55:05.613252 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.613235 2560 scope.go:117] "RemoveContainer" containerID="de89645f5bef3572eb2f905becc3643955ea07d5b0d7234f45b5ad9d758b1511" Apr 
17 07:55:05.614185 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.614164 2560 generic.go:358] "Generic (PLEG): container finished" podID="e1e3f9aa-48cf-4782-9812-a4dd634891fe" containerID="9a5d0fb8ec038865dcf5497cac0858c7459044733273052587eafb8f8694f683" exitCode=255 Apr 17 07:55:05.614267 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.614229 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" event={"ID":"e1e3f9aa-48cf-4782-9812-a4dd634891fe","Type":"ContainerDied","Data":"9a5d0fb8ec038865dcf5497cac0858c7459044733273052587eafb8f8694f683"} Apr 17 07:55:05.615650 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.615624 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mxsbk" event={"ID":"f273b17c-4ccf-45b2-93e0-868dd9134101","Type":"ContainerStarted","Data":"b23a27b29a15a5b3157c8692a4071a177d198922458cacdb90808abd88fef0a1"} Apr 17 07:55:05.617219 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.617200 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4k2tj" event={"ID":"00ef5608-cd73-4a1a-ac02-5e5da7727a5f","Type":"ContainerStarted","Data":"1505ea86422e8bcdd748fa16c2d6df88426b8034373b928444e6fcdc16541801"} Apr 17 07:55:05.617304 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.617225 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4k2tj" event={"ID":"00ef5608-cd73-4a1a-ac02-5e5da7727a5f","Type":"ContainerStarted","Data":"ebbb57d9f15bf490ef60756a71586d98d7b96b47286d5cb89ac19efd2c063447"} Apr 17 07:55:05.617370 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.617355 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4k2tj" Apr 17 07:55:05.621711 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.621693 2560 scope.go:117] "RemoveContainer" 
containerID="9a5d0fb8ec038865dcf5497cac0858c7459044733273052587eafb8f8694f683" Apr 17 07:55:05.654652 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.654614 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mxsbk" podStartSLOduration=129.899264363 podStartE2EDuration="2m11.654599142s" podCreationTimestamp="2026-04-17 07:52:54 +0000 UTC" firstStartedPulling="2026-04-17 07:55:03.243097331 +0000 UTC m=+161.730219112" lastFinishedPulling="2026-04-17 07:55:04.998432096 +0000 UTC m=+163.485553891" observedRunningTime="2026-04-17 07:55:05.65348659 +0000 UTC m=+164.140608395" watchObservedRunningTime="2026-04-17 07:55:05.654599142 +0000 UTC m=+164.141720945" Apr 17 07:55:05.685588 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:05.685196 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4k2tj" podStartSLOduration=129.917372108 podStartE2EDuration="2m11.685177795s" podCreationTimestamp="2026-04-17 07:52:54 +0000 UTC" firstStartedPulling="2026-04-17 07:55:03.228503976 +0000 UTC m=+161.715625756" lastFinishedPulling="2026-04-17 07:55:04.996309656 +0000 UTC m=+163.483431443" observedRunningTime="2026-04-17 07:55:05.685059107 +0000 UTC m=+164.172180910" watchObservedRunningTime="2026-04-17 07:55:05.685177795 +0000 UTC m=+164.172299603" Apr 17 07:55:06.621533 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:06.621500 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" event={"ID":"f95b90db-0860-4924-8687-bea47871abc9","Type":"ContainerStarted","Data":"7ac187f82bace7bbb3d2b54e187c861a07e1f2f86cbe1a17853f7ee371880a64"} Apr 17 07:55:06.622012 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:06.621788 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 
07:55:06.622536 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:06.622513 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f7ff4954-q4sjs" Apr 17 07:55:06.623234 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:06.623204 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7545f5564f-jzkjn" event={"ID":"e1e3f9aa-48cf-4782-9812-a4dd634891fe","Type":"ContainerStarted","Data":"2e48535d8e3edd587c6c91d8adf34637b6ada39240a16dcd537dad85bab0125b"} Apr 17 07:55:11.979591 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.979513 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5845f77674-gg4r5"] Apr 17 07:55:11.992354 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.992325 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4j7xs"] Apr 17 07:55:11.994624 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.994602 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:11.997613 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.997594 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 07:55:11.997708 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.997632 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 07:55:11.998210 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.998187 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-54hk8\"" Apr 17 07:55:11.998303 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.998237 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 07:55:11.998623 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:11.998610 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 07:55:12.009826 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.009800 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4j7xs"] Apr 17 07:55:12.023411 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.023387 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b9ddc8f4-p749g"] Apr 17 07:55:12.025458 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.025433 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.046583 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.046553 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b9ddc8f4-p749g"] Apr 17 07:55:12.178106 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178070 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ad78d61-0cc9-46c4-84cd-f522e68f9763-image-registry-private-configuration\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178106 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178112 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/23614c33-d1d2-4da0-8603-df308834ff05-data-volume\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.178327 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178133 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ad78d61-0cc9-46c4-84cd-f522e68f9763-registry-certificates\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178327 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178181 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/23614c33-d1d2-4da0-8603-df308834ff05-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.178327 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178261 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ad78d61-0cc9-46c4-84cd-f522e68f9763-installation-pull-secrets\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178327 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178293 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-registry-tls\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178327 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178312 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/23614c33-d1d2-4da0-8603-df308834ff05-crio-socket\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.178509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178329 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/23614c33-d1d2-4da0-8603-df308834ff05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.178509 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:55:12.178355 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-bound-sa-token\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178427 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ad78d61-0cc9-46c4-84cd-f522e68f9763-ca-trust-extracted\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178466 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ad78d61-0cc9-46c4-84cd-f522e68f9763-trusted-ca\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.178509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178482 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s74k8\" (UniqueName: \"kubernetes.io/projected/23614c33-d1d2-4da0-8603-df308834ff05-kube-api-access-s74k8\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.178509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.178507 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7kf\" (UniqueName: 
\"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-kube-api-access-xs7kf\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279235 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279147 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ad78d61-0cc9-46c4-84cd-f522e68f9763-image-registry-private-configuration\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279235 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279197 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/23614c33-d1d2-4da0-8603-df308834ff05-data-volume\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.279235 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279218 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ad78d61-0cc9-46c4-84cd-f522e68f9763-registry-certificates\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279236 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/23614c33-d1d2-4da0-8603-df308834ff05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " 
pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279271 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ad78d61-0cc9-46c4-84cd-f522e68f9763-installation-pull-secrets\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279301 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-registry-tls\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279325 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/23614c33-d1d2-4da0-8603-df308834ff05-crio-socket\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279349 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/23614c33-d1d2-4da0-8603-df308834ff05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279373 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-bound-sa-token\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279417 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ad78d61-0cc9-46c4-84cd-f522e68f9763-ca-trust-extracted\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279455 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ad78d61-0cc9-46c4-84cd-f522e68f9763-trusted-ca\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279463 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/23614c33-d1d2-4da0-8603-df308834ff05-crio-socket\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.279488 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279477 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s74k8\" (UniqueName: \"kubernetes.io/projected/23614c33-d1d2-4da0-8603-df308834ff05-kube-api-access-s74k8\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.279968 ip-10-0-141-224 kubenswrapper[2560]: 
I0417 07:55:12.279570 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7kf\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-kube-api-access-xs7kf\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.279968 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.279664 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/23614c33-d1d2-4da0-8603-df308834ff05-data-volume\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.280106 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.280001 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ad78d61-0cc9-46c4-84cd-f522e68f9763-ca-trust-extracted\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.280106 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.280014 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/23614c33-d1d2-4da0-8603-df308834ff05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.280726 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.280699 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ad78d61-0cc9-46c4-84cd-f522e68f9763-registry-certificates\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: 
\"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.280882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.280750 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ad78d61-0cc9-46c4-84cd-f522e68f9763-trusted-ca\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.281946 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.281923 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ad78d61-0cc9-46c4-84cd-f522e68f9763-image-registry-private-configuration\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.282054 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.282022 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-registry-tls\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.282227 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.282210 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ad78d61-0cc9-46c4-84cd-f522e68f9763-installation-pull-secrets\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.282417 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.282398 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/23614c33-d1d2-4da0-8603-df308834ff05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.288046 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.288021 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s74k8\" (UniqueName: \"kubernetes.io/projected/23614c33-d1d2-4da0-8603-df308834ff05-kube-api-access-s74k8\") pod \"insights-runtime-extractor-4j7xs\" (UID: \"23614c33-d1d2-4da0-8603-df308834ff05\") " pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.288221 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.288199 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7kf\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-kube-api-access-xs7kf\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.288301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.288238 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ad78d61-0cc9-46c4-84cd-f522e68f9763-bound-sa-token\") pod \"image-registry-5b9ddc8f4-p749g\" (UID: \"3ad78d61-0cc9-46c4-84cd-f522e68f9763\") " pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.304025 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.304000 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4j7xs" Apr 17 07:55:12.333626 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.333595 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.434236 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.434209 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4j7xs"] Apr 17 07:55:12.465825 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.465797 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b9ddc8f4-p749g"] Apr 17 07:55:12.469785 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:12.469758 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad78d61_0cc9_46c4_84cd_f522e68f9763.slice/crio-a55fc7a597dcddee1c49245c85d6fac853947e8c9fac887f7f4e324451e902cd WatchSource:0}: Error finding container a55fc7a597dcddee1c49245c85d6fac853947e8c9fac887f7f4e324451e902cd: Status 404 returned error can't find the container with id a55fc7a597dcddee1c49245c85d6fac853947e8c9fac887f7f4e324451e902cd Apr 17 07:55:12.640322 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.640283 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4j7xs" event={"ID":"23614c33-d1d2-4da0-8603-df308834ff05","Type":"ContainerStarted","Data":"1b2212eec15b6ca50e8697e220bbda1969c69c8cdb69b3dfeeef9c9917c96671"} Apr 17 07:55:12.640322 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.640327 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4j7xs" event={"ID":"23614c33-d1d2-4da0-8603-df308834ff05","Type":"ContainerStarted","Data":"64512706ee48eaa3e41902549770c5ceb65c5eb8bb783e597ab2b189112ac49d"} Apr 17 07:55:12.641708 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.641685 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" 
event={"ID":"3ad78d61-0cc9-46c4-84cd-f522e68f9763","Type":"ContainerStarted","Data":"2497d68ebd7afa1a7c24fd5385414198fc7de0100146603077394a5dfb8e1334"} Apr 17 07:55:12.641819 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.641714 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" event={"ID":"3ad78d61-0cc9-46c4-84cd-f522e68f9763","Type":"ContainerStarted","Data":"a55fc7a597dcddee1c49245c85d6fac853947e8c9fac887f7f4e324451e902cd"} Apr 17 07:55:12.641819 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.641801 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" Apr 17 07:55:12.660069 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:12.660021 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g" podStartSLOduration=1.660006726 podStartE2EDuration="1.660006726s" podCreationTimestamp="2026-04-17 07:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:55:12.659205729 +0000 UTC m=+171.146327535" watchObservedRunningTime="2026-04-17 07:55:12.660006726 +0000 UTC m=+171.147128520" Apr 17 07:55:13.128079 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:13.128046 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:55:13.645740 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:13.645700 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4j7xs" event={"ID":"23614c33-d1d2-4da0-8603-df308834ff05","Type":"ContainerStarted","Data":"0dc40bd24cf1eaa9656fea728b7c32510e9df39b9fe1baa21dc98ece7fcf7180"} Apr 17 07:55:14.650324 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:14.650278 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4j7xs" event={"ID":"23614c33-d1d2-4da0-8603-df308834ff05","Type":"ContainerStarted","Data":"6485efeeecd33596ca5255540a3183295c1c753ccdc91476d8f9052a1e4cbba8"} Apr 17 07:55:14.666876 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:14.666802 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4j7xs" podStartSLOduration=1.7306412500000001 podStartE2EDuration="3.666786298s" podCreationTimestamp="2026-04-17 07:55:11 +0000 UTC" firstStartedPulling="2026-04-17 07:55:12.502972619 +0000 UTC m=+170.990094414" lastFinishedPulling="2026-04-17 07:55:14.439117663 +0000 UTC m=+172.926239462" observedRunningTime="2026-04-17 07:55:14.666602325 +0000 UTC m=+173.153724128" watchObservedRunningTime="2026-04-17 07:55:14.666786298 +0000 UTC m=+173.153908108" Apr 17 07:55:15.625468 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:15.625428 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4k2tj" Apr 17 07:55:21.984560 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:21.984531 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:55:25.377564 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.377535 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/kube-state-metrics-69db897b98-b57gm"] Apr 17 07:55:25.384093 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.384072 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.387968 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.387947 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 07:55:25.388115 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.387947 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 07:55:25.388182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.387947 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 07:55:25.388182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.387946 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 07:55:25.388286 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.388022 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 07:55:25.388286 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.388034 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 07:55:25.388286 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.388036 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-cdj5w\"" Apr 17 07:55:25.390628 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.390598 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-4cwcz"] Apr 17 07:55:25.394121 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.394103 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.394925 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.394907 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-b57gm"] Apr 17 07:55:25.396278 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.396258 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 07:55:25.396352 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.396264 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 07:55:25.396861 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.396845 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bpbfn\"" Apr 17 07:55:25.397124 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.397105 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 07:55:25.474693 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474655 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79pt\" (UniqueName: \"kubernetes.io/projected/749cabee-7543-4558-9e74-fbd5becf3299-kube-api-access-g79pt\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.474693 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474695 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-textfile\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474712 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474732 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-root\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474749 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474819 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrk7j\" (UniqueName: \"kubernetes.io/projected/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-api-access-wrk7j\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474855 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-accelerators-collector-config\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474873 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.474913 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474890 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/749cabee-7543-4558-9e74-fbd5becf3299-metrics-client-ca\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.475182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474917 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.475182 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:55:25.474948 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-tls\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.475182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.474970 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-sys\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.475182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.475007 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.475182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.475035 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.475182 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.475053 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-wtmp\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.575937 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.575901 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrk7j\" (UniqueName: \"kubernetes.io/projected/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-api-access-wrk7j\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.575937 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.575938 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-accelerators-collector-config\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.575959 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.575976 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/749cabee-7543-4558-9e74-fbd5becf3299-metrics-client-ca\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576178 
ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576018 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576050 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-tls\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576075 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-sys\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576093 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576113 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.576178 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576144 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-wtmp\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576582 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576186 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g79pt\" (UniqueName: \"kubernetes.io/projected/749cabee-7543-4558-9e74-fbd5becf3299-kube-api-access-g79pt\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576582 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576220 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-textfile\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576582 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576229 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-sys\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576582 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576244 2560 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.576582 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576284 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-root\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576582 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576312 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576864 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576659 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-accelerators-collector-config\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576864 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576698 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/749cabee-7543-4558-9e74-fbd5becf3299-metrics-client-ca\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " 
pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576864 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576736 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-root\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.576864 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576858 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.577098 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.576860 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-wtmp\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.577284 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.577259 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-textfile\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.577475 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.577455 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: 
\"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.577475 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.577467 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.578639 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.578615 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-tls\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.578753 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.578687 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.578851 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.578833 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/749cabee-7543-4558-9e74-fbd5becf3299-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.578947 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.578930 2560 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.585785 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.585760 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrk7j\" (UniqueName: \"kubernetes.io/projected/26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f-kube-api-access-wrk7j\") pod \"kube-state-metrics-69db897b98-b57gm\" (UID: \"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.586401 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.586379 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79pt\" (UniqueName: \"kubernetes.io/projected/749cabee-7543-4558-9e74-fbd5becf3299-kube-api-access-g79pt\") pod \"node-exporter-4cwcz\" (UID: \"749cabee-7543-4558-9e74-fbd5becf3299\") " pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.693249 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.693171 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" Apr 17 07:55:25.702821 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.702698 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4cwcz" Apr 17 07:55:25.713470 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:25.713435 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749cabee_7543_4558_9e74_fbd5becf3299.slice/crio-f1082ce759619175adb18e908fa924af97f4ce9b369e9bb08fff093bf5b58026 WatchSource:0}: Error finding container f1082ce759619175adb18e908fa924af97f4ce9b369e9bb08fff093bf5b58026: Status 404 returned error can't find the container with id f1082ce759619175adb18e908fa924af97f4ce9b369e9bb08fff093bf5b58026 Apr 17 07:55:25.811667 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:25.811636 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-b57gm"] Apr 17 07:55:25.814704 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:25.814675 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26cf8eb2_838b_43fd_8fd1_d6b3cf466b8f.slice/crio-30090395dcca6eb566e2ce3047056cfad590e8daa759fb854ec22ad2c3d2f14e WatchSource:0}: Error finding container 30090395dcca6eb566e2ce3047056cfad590e8daa759fb854ec22ad2c3d2f14e: Status 404 returned error can't find the container with id 30090395dcca6eb566e2ce3047056cfad590e8daa759fb854ec22ad2c3d2f14e Apr 17 07:55:26.462867 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.462834 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:55:26.485481 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.485450 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:55:26.485656 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.485640 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.489101 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.488762 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 07:55:26.489101 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489031 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 07:55:26.489101 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489069 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 07:55:26.489398 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489296 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 07:55:26.489398 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489354 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 07:55:26.489398 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489354 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 07:55:26.489576 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489493 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 07:55:26.489576 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489493 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 07:55:26.490004 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489831 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 07:55:26.490004 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.489938 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-w46vb\"" Apr 17 07:55:26.584664 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.584607 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.584664 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.584663 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.584733 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpfgf\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-kube-api-access-zpfgf\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.584834 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-web-config\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.584869 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.584920 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-out\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585019 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585127 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585057 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585442 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585114 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585442 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585230 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585442 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585261 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585442 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585296 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:55:26.585442 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.585325 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
07:55:26.680765 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.680733 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4cwcz" event={"ID":"749cabee-7543-4558-9e74-fbd5becf3299","Type":"ContainerStarted","Data":"f1082ce759619175adb18e908fa924af97f4ce9b369e9bb08fff093bf5b58026"}
Apr 17 07:55:26.681835 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.681806 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" event={"ID":"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f","Type":"ContainerStarted","Data":"30090395dcca6eb566e2ce3047056cfad590e8daa759fb854ec22ad2c3d2f14e"}
Apr 17 07:55:26.686233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686212 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686343 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686245 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686343 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686280 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686343 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686305 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686343 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686326 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpfgf\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-kube-api-access-zpfgf\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686546 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686436 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-web-config\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686546 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686486 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686546 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686527 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-out\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686702 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686581 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686702 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686612 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686702 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686667 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686848 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686727 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.686848 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:55:26.686769 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle podName:7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec nodeName:}" failed. No retries permitted until 2026-04-17 07:55:27.186747062 +0000 UTC m=+185.673868847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec") : configmap references non-existent config key: ca-bundle.crt
Apr 17 07:55:26.686848 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.686798 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.687022 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:55:26.686924 2560 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 17 07:55:26.687022 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:55:26.686981 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls podName:7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec nodeName:}" failed. No retries permitted until 2026-04-17 07:55:27.186961632 +0000 UTC m=+185.674083423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec") : secret "alertmanager-main-tls" not found
Apr 17 07:55:26.687140 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.687069 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.688042 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.688015 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.689621 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.689562 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.689621 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.689593 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.690094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.690057 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.690094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.690060 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.690591 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.690546 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-out\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.690683 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.690618 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.690683 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.690670 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-web-config\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.691132 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.691115 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:26.695027 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:26.694982 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpfgf\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-kube-api-access-zpfgf\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:27.190956 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.190915 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:27.191148 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.190978 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:27.191741 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.191719 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:27.193347 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.193328 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:27.398667 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.398560 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:55:27.557928 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.557895 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 07:55:27.561871 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:27.561838 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7309d7e6_66ac_41d9_b7ce_d64b60b8e5ec.slice/crio-4359ca0356bdcecdd11d2147e57c0bbbb788fbb4ccd6eaf3efc513a4ab8faf47 WatchSource:0}: Error finding container 4359ca0356bdcecdd11d2147e57c0bbbb788fbb4ccd6eaf3efc513a4ab8faf47: Status 404 returned error can't find the container with id 4359ca0356bdcecdd11d2147e57c0bbbb788fbb4ccd6eaf3efc513a4ab8faf47
Apr 17 07:55:27.686079 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.686040 2560 generic.go:358] "Generic (PLEG): container finished" podID="749cabee-7543-4558-9e74-fbd5becf3299" containerID="37519adcc1fb53f1dd8e1dca9616dece7a7e9c5334dbc05dca1238d2d6639850" exitCode=0
Apr 17 07:55:27.686268 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.686122 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4cwcz" event={"ID":"749cabee-7543-4558-9e74-fbd5becf3299","Type":"ContainerDied","Data":"37519adcc1fb53f1dd8e1dca9616dece7a7e9c5334dbc05dca1238d2d6639850"}
Apr 17 07:55:27.688150 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.688120 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" event={"ID":"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f","Type":"ContainerStarted","Data":"f77ad42e7559b82c673c50ba345bd2b267619d4f3de3e496c55a42b3ee8afa47"}
Apr 17 07:55:27.688271 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.688159 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" event={"ID":"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f","Type":"ContainerStarted","Data":"ff1a664f68c27317cf9a6d620974cf95b99f94dde465db34d2195322ccc619b3"}
Apr 17 07:55:27.688271 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.688173 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" event={"ID":"26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f","Type":"ContainerStarted","Data":"419561ea56344db719e752c46beb393de79eb28d5331e32f22a0e5184362b97b"}
Apr 17 07:55:27.689203 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.689179 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"4359ca0356bdcecdd11d2147e57c0bbbb788fbb4ccd6eaf3efc513a4ab8faf47"}
Apr 17 07:55:27.722291 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:27.722248 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-b57gm" podStartSLOduration=1.27458418 podStartE2EDuration="2.722234082s" podCreationTimestamp="2026-04-17 07:55:25 +0000 UTC" firstStartedPulling="2026-04-17 07:55:25.816657535 +0000 UTC m=+184.303779316" lastFinishedPulling="2026-04-17 07:55:27.264307421 +0000 UTC m=+185.751429218" observedRunningTime="2026-04-17 07:55:27.721675569 +0000 UTC m=+186.208797385" watchObservedRunningTime="2026-04-17 07:55:27.722234082 +0000 UTC m=+186.209355885"
Apr 17 07:55:28.694498 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:28.694462 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4cwcz" event={"ID":"749cabee-7543-4558-9e74-fbd5becf3299","Type":"ContainerStarted","Data":"9b7e1653d912906690aee60d9f414d8edcf7d4893a5bc751b9820b89c0598af7"}
Apr 17 07:55:28.694943 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:28.694505 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4cwcz" event={"ID":"749cabee-7543-4558-9e74-fbd5becf3299","Type":"ContainerStarted","Data":"6fdfc812d9c780be1ad019666f49b844828f715dd620542476f31756be3254bf"}
Apr 17 07:55:28.695945 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:28.695915 2560 generic.go:358] "Generic (PLEG): container finished" podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242" exitCode=0
Apr 17 07:55:28.696063 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:28.696036 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242"}
Apr 17 07:55:28.712450 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:28.712323 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4cwcz" podStartSLOduration=2.167240918 podStartE2EDuration="3.71230466s" podCreationTimestamp="2026-04-17 07:55:25 +0000 UTC" firstStartedPulling="2026-04-17 07:55:25.715884083 +0000 UTC m=+184.203005868" lastFinishedPulling="2026-04-17 07:55:27.260947825 +0000 UTC m=+185.748069610" observedRunningTime="2026-04-17 07:55:28.711968935 +0000 UTC m=+187.199090739" watchObservedRunningTime="2026-04-17 07:55:28.71230466 +0000 UTC m=+187.199426466"
Apr 17 07:55:30.704229 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:30.704154 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e"}
Apr 17 07:55:30.704229 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:30.704189 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df"}
Apr 17 07:55:30.704229 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:30.704198 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df"}
Apr 17 07:55:30.704229 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:30.704208 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f"}
Apr 17 07:55:30.704229 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:30.704217 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048"}
Apr 17 07:55:31.710609 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:31.710569 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerStarted","Data":"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb"}
Apr 17 07:55:31.740801 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:31.740753 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.12104619 podStartE2EDuration="5.740738798s" podCreationTimestamp="2026-04-17 07:55:26 +0000 UTC" firstStartedPulling="2026-04-17 07:55:27.563854316 +0000 UTC m=+186.050976103" lastFinishedPulling="2026-04-17 07:55:31.183546917 +0000 UTC m=+189.670668711" observedRunningTime="2026-04-17 07:55:31.740247299 +0000 UTC m=+190.227369103" watchObservedRunningTime="2026-04-17 07:55:31.740738798 +0000 UTC m=+190.227860601"
Apr 17 07:55:33.649641 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:33.649610 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b9ddc8f4-p749g"
Apr 17 07:55:35.429145 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.429110 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b6bb9c595-qc478"]
Apr 17 07:55:35.432319 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.432297 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.435734 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435704 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 07:55:35.435873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435759 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 07:55:35.435873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435770 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 07:55:35.435873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435778 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 07:55:35.435873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435846 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j6lrp\""
Apr 17 07:55:35.435873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435852 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 07:55:35.435873 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435786 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 07:55:35.436118 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.435971 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 07:55:35.445370 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.445352 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b6bb9c595-qc478"]
Apr 17 07:55:35.463027 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.462979 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-oauth-serving-cert\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.463162 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.463050 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-console-config\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.463162 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.463067 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcxn\" (UniqueName: \"kubernetes.io/projected/d1f86369-8a7f-4a7c-a739-a83856871552-kube-api-access-jvcxn\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.463162 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.463095 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-service-ca\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.463283 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.463190 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-oauth-config\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.463283 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.463217 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-serving-cert\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564228 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564188 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-oauth-serving-cert\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564426 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564252 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-console-config\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564426 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564272 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcxn\" (UniqueName: \"kubernetes.io/projected/d1f86369-8a7f-4a7c-a739-a83856871552-kube-api-access-jvcxn\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564426 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564299 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-service-ca\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564426 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564346 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-oauth-config\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564426 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564362 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-serving-cert\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.564950 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.564920 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-oauth-serving-cert\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.565539 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.565515 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-service-ca\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.565630 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.565609 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-console-config\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.566728 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.566713 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-oauth-config\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.566833 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.566816 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-serving-cert\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.574621 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.574594 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcxn\" (UniqueName: \"kubernetes.io/projected/d1f86369-8a7f-4a7c-a739-a83856871552-kube-api-access-jvcxn\") pod \"console-6b6bb9c595-qc478\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.741578 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.741496 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b6bb9c595-qc478"
Apr 17 07:55:35.859772 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:35.859739 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b6bb9c595-qc478"]
Apr 17 07:55:35.862688 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:55:35.862662 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1f86369_8a7f_4a7c_a739_a83856871552.slice/crio-88fe811236efd1860964fba273ee442d812546c587c57149b23ff617ed761239 WatchSource:0}: Error finding container 88fe811236efd1860964fba273ee442d812546c587c57149b23ff617ed761239: Status 404 returned error can't find the container with id 88fe811236efd1860964fba273ee442d812546c587c57149b23ff617ed761239
Apr 17 07:55:36.728489 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:36.728439 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b6bb9c595-qc478" event={"ID":"d1f86369-8a7f-4a7c-a739-a83856871552","Type":"ContainerStarted","Data":"88fe811236efd1860964fba273ee442d812546c587c57149b23ff617ed761239"}
Apr 17 07:55:36.998883 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:36.998786 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" podUID="cfacbb4c-e321-492b-9f6c-f223c66aba6e" containerName="registry" containerID="cri-o://897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc" gracePeriod=30
Apr 17 07:55:37.246366 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.246338 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5845f77674-gg4r5"
Apr 17 07:55:37.278358 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278280 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-image-registry-private-configuration\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278358 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278324 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-installation-pull-secrets\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278358 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278359 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-bound-sa-token\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278432 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89cwl\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-kube-api-access-89cwl\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278463 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-certificates\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278512 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfacbb4c-e321-492b-9f6c-f223c66aba6e-ca-trust-extracted\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278579 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.278620 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.278611 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-trusted-ca\") pod \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\" (UID: \"cfacbb4c-e321-492b-9f6c-f223c66aba6e\") "
Apr 17 07:55:37.279203 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.279152 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:55:37.279303 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.279254 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:55:37.281422 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.281361 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:55:37.281422 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.281366 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:55:37.281782 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.281759 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:55:37.282006 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.281951 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-kube-api-access-89cwl" (OuterVolumeSpecName: "kube-api-access-89cwl") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "kube-api-access-89cwl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:55:37.282887 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.282862 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:55:37.290113 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.290090 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfacbb4c-e321-492b-9f6c-f223c66aba6e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cfacbb4c-e321-492b-9f6c-f223c66aba6e" (UID: "cfacbb4c-e321-492b-9f6c-f223c66aba6e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 07:55:37.380084 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380052 2560 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:55:37.380084 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380080 2560 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-trusted-ca\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:55:37.380084 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380090 2560 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-image-registry-private-configuration\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17
07:55:37.380301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380101 2560 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfacbb4c-e321-492b-9f6c-f223c66aba6e-installation-pull-secrets\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:55:37.380301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380113 2560 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-bound-sa-token\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:55:37.380301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380122 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89cwl\" (UniqueName: \"kubernetes.io/projected/cfacbb4c-e321-492b-9f6c-f223c66aba6e-kube-api-access-89cwl\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:55:37.380301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380131 2560 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfacbb4c-e321-492b-9f6c-f223c66aba6e-registry-certificates\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:55:37.380301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.380139 2560 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfacbb4c-e321-492b-9f6c-f223c66aba6e-ca-trust-extracted\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:55:37.737244 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.737211 2560 generic.go:358] "Generic (PLEG): container finished" podID="cfacbb4c-e321-492b-9f6c-f223c66aba6e" containerID="897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc" exitCode=0 Apr 17 07:55:37.737701 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.737266 2560 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" event={"ID":"cfacbb4c-e321-492b-9f6c-f223c66aba6e","Type":"ContainerDied","Data":"897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc"} Apr 17 07:55:37.737701 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.737290 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" event={"ID":"cfacbb4c-e321-492b-9f6c-f223c66aba6e","Type":"ContainerDied","Data":"538ddd79218ca41874f96725a04eccaad5f4aa758085b97969611c76e674288d"} Apr 17 07:55:37.737701 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.737310 2560 scope.go:117] "RemoveContainer" containerID="897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc" Apr 17 07:55:37.737701 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.737319 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5845f77674-gg4r5" Apr 17 07:55:37.758737 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.758709 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5845f77674-gg4r5"] Apr 17 07:55:37.763658 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:37.763633 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5845f77674-gg4r5"] Apr 17 07:55:38.133412 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:38.133371 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfacbb4c-e321-492b-9f6c-f223c66aba6e" path="/var/lib/kubelet/pods/cfacbb4c-e321-492b-9f6c-f223c66aba6e/volumes" Apr 17 07:55:38.380257 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:38.380206 2560 scope.go:117] "RemoveContainer" containerID="897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc" Apr 17 07:55:38.380585 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:55:38.380564 2560 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc\": container with ID starting with 897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc not found: ID does not exist" containerID="897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc" Apr 17 07:55:38.380630 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:38.380595 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc"} err="failed to get container status \"897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc\": rpc error: code = NotFound desc = could not find container \"897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc\": container with ID starting with 897123d532425b0562eed80172dfac3efd654ee01163ef387d11b68dac3302bc not found: ID does not exist" Apr 17 07:55:38.741798 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:38.741712 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b6bb9c595-qc478" event={"ID":"d1f86369-8a7f-4a7c-a739-a83856871552","Type":"ContainerStarted","Data":"1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7"} Apr 17 07:55:38.757736 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:38.757690 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b6bb9c595-qc478" podStartSLOduration=1.19248658 podStartE2EDuration="3.757674901s" podCreationTimestamp="2026-04-17 07:55:35 +0000 UTC" firstStartedPulling="2026-04-17 07:55:35.864656754 +0000 UTC m=+194.351778536" lastFinishedPulling="2026-04-17 07:55:38.429845076 +0000 UTC m=+196.916966857" observedRunningTime="2026-04-17 07:55:38.756283419 +0000 UTC m=+197.243405224" watchObservedRunningTime="2026-04-17 07:55:38.757674901 +0000 UTC m=+197.244796704" Apr 17 07:55:45.741733 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:55:45.741688 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b6bb9c595-qc478" Apr 17 07:55:45.741733 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:45.741741 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b6bb9c595-qc478" Apr 17 07:55:45.746509 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:45.746486 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b6bb9c595-qc478" Apr 17 07:55:45.764633 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:45.764605 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b6bb9c595-qc478" Apr 17 07:55:55.572545 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:55:55.572509 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b6bb9c595-qc478"] Apr 17 07:56:20.593862 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.593805 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b6bb9c595-qc478" podUID="d1f86369-8a7f-4a7c-a739-a83856871552" containerName="console" containerID="cri-o://1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7" gracePeriod=15 Apr 17 07:56:20.824703 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.824678 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b6bb9c595-qc478_d1f86369-8a7f-4a7c-a739-a83856871552/console/0.log" Apr 17 07:56:20.824852 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.824751 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b6bb9c595-qc478" Apr 17 07:56:20.843353 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843323 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-console-config\") pod \"d1f86369-8a7f-4a7c-a739-a83856871552\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " Apr 17 07:56:20.843502 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843365 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-oauth-config\") pod \"d1f86369-8a7f-4a7c-a739-a83856871552\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " Apr 17 07:56:20.843502 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843396 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-serving-cert\") pod \"d1f86369-8a7f-4a7c-a739-a83856871552\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " Apr 17 07:56:20.843502 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843437 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvcxn\" (UniqueName: \"kubernetes.io/projected/d1f86369-8a7f-4a7c-a739-a83856871552-kube-api-access-jvcxn\") pod \"d1f86369-8a7f-4a7c-a739-a83856871552\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " Apr 17 07:56:20.843502 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843485 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-service-ca\") pod \"d1f86369-8a7f-4a7c-a739-a83856871552\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " Apr 17 07:56:20.843705 
ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843539 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-oauth-serving-cert\") pod \"d1f86369-8a7f-4a7c-a739-a83856871552\" (UID: \"d1f86369-8a7f-4a7c-a739-a83856871552\") " Apr 17 07:56:20.843908 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843842 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-console-config" (OuterVolumeSpecName: "console-config") pod "d1f86369-8a7f-4a7c-a739-a83856871552" (UID: "d1f86369-8a7f-4a7c-a739-a83856871552"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:20.844029 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.843953 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-service-ca" (OuterVolumeSpecName: "service-ca") pod "d1f86369-8a7f-4a7c-a739-a83856871552" (UID: "d1f86369-8a7f-4a7c-a739-a83856871552"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:20.844098 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.844040 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d1f86369-8a7f-4a7c-a739-a83856871552" (UID: "d1f86369-8a7f-4a7c-a739-a83856871552"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:20.846036 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.846002 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d1f86369-8a7f-4a7c-a739-a83856871552" (UID: "d1f86369-8a7f-4a7c-a739-a83856871552"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:20.846265 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.846245 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d1f86369-8a7f-4a7c-a739-a83856871552" (UID: "d1f86369-8a7f-4a7c-a739-a83856871552"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:20.846367 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.846345 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f86369-8a7f-4a7c-a739-a83856871552-kube-api-access-jvcxn" (OuterVolumeSpecName: "kube-api-access-jvcxn") pod "d1f86369-8a7f-4a7c-a739-a83856871552" (UID: "d1f86369-8a7f-4a7c-a739-a83856871552"). InnerVolumeSpecName "kube-api-access-jvcxn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:20.855332 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.855311 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b6bb9c595-qc478_d1f86369-8a7f-4a7c-a739-a83856871552/console/0.log" Apr 17 07:56:20.855418 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.855360 2560 generic.go:358] "Generic (PLEG): container finished" podID="d1f86369-8a7f-4a7c-a739-a83856871552" containerID="1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7" exitCode=2 Apr 17 07:56:20.855418 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.855400 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b6bb9c595-qc478" event={"ID":"d1f86369-8a7f-4a7c-a739-a83856871552","Type":"ContainerDied","Data":"1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7"} Apr 17 07:56:20.855499 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.855432 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b6bb9c595-qc478" Apr 17 07:56:20.855499 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.855450 2560 scope.go:117] "RemoveContainer" containerID="1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7" Apr 17 07:56:20.855578 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.855437 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b6bb9c595-qc478" event={"ID":"d1f86369-8a7f-4a7c-a739-a83856871552","Type":"ContainerDied","Data":"88fe811236efd1860964fba273ee442d812546c587c57149b23ff617ed761239"} Apr 17 07:56:20.863865 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.863847 2560 scope.go:117] "RemoveContainer" containerID="1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7" Apr 17 07:56:20.864149 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:20.864128 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7\": container with ID starting with 1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7 not found: ID does not exist" containerID="1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7" Apr 17 07:56:20.864222 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.864156 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7"} err="failed to get container status \"1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7\": rpc error: code = NotFound desc = could not find container \"1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7\": container with ID starting with 1cfaff3058fd7585ddfe6a4620b73e05aee420adfaefe0204821241e9b3764e7 not found: ID does not exist" Apr 17 07:56:20.875614 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.875579 2560 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b6bb9c595-qc478"] Apr 17 07:56:20.878631 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.878607 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b6bb9c595-qc478"] Apr 17 07:56:20.944238 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.944203 2560 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-console-config\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:20.944238 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.944237 2560 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-oauth-config\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:20.944408 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.944249 2560 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f86369-8a7f-4a7c-a739-a83856871552-console-serving-cert\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:20.944408 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.944259 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvcxn\" (UniqueName: \"kubernetes.io/projected/d1f86369-8a7f-4a7c-a739-a83856871552-kube-api-access-jvcxn\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:20.944408 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.944269 2560 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-service-ca\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:20.944408 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:20.944279 2560 reconciler_common.go:299] "Volume 
detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1f86369-8a7f-4a7c-a739-a83856871552-oauth-serving-cert\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:22.133416 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:22.133381 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f86369-8a7f-4a7c-a739-a83856871552" path="/var/lib/kubelet/pods/d1f86369-8a7f-4a7c-a739-a83856871552/volumes" Apr 17 07:56:33.850088 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:33.850048 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:56:33.852331 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:33.852306 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca1d48-95c2-414b-af4e-838843029028-metrics-certs\") pod \"network-metrics-daemon-k4vcb\" (UID: \"f6ca1d48-95c2-414b-af4e-838843029028\") " pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:56:34.131460 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:34.131386 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jkb85\"" Apr 17 07:56:34.140204 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:34.140178 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k4vcb" Apr 17 07:56:34.253672 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:34.253636 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k4vcb"] Apr 17 07:56:34.257885 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:56:34.257858 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ca1d48_95c2_414b_af4e_838843029028.slice/crio-af96c0b61f96009a87d799d2040793d061b26c1ca279bedd375f1730965310d6 WatchSource:0}: Error finding container af96c0b61f96009a87d799d2040793d061b26c1ca279bedd375f1730965310d6: Status 404 returned error can't find the container with id af96c0b61f96009a87d799d2040793d061b26c1ca279bedd375f1730965310d6 Apr 17 07:56:34.896738 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:34.896702 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k4vcb" event={"ID":"f6ca1d48-95c2-414b-af4e-838843029028","Type":"ContainerStarted","Data":"af96c0b61f96009a87d799d2040793d061b26c1ca279bedd375f1730965310d6"} Apr 17 07:56:35.900903 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:35.900870 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k4vcb" event={"ID":"f6ca1d48-95c2-414b-af4e-838843029028","Type":"ContainerStarted","Data":"457c67101a8fca520fbc8d2d2752dbed8aa1a60d3424e1a6db6db82a10c9791c"} Apr 17 07:56:35.900903 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:35.900905 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k4vcb" event={"ID":"f6ca1d48-95c2-414b-af4e-838843029028","Type":"ContainerStarted","Data":"c701d168e7dc0edd727c3a1f5e0241805d599ed94b3a52dd7c50f9ff8fe212a7"} Apr 17 07:56:35.919917 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:35.919836 2560 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-k4vcb" podStartSLOduration=253.010894382 podStartE2EDuration="4m13.919816265s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:56:34.25993157 +0000 UTC m=+252.747053352" lastFinishedPulling="2026-04-17 07:56:35.168853455 +0000 UTC m=+253.655975235" observedRunningTime="2026-04-17 07:56:35.918664574 +0000 UTC m=+254.405786377" watchObservedRunningTime="2026-04-17 07:56:35.919816265 +0000 UTC m=+254.406938070" Apr 17 07:56:45.664882 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.664803 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:45.665444 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.665279 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="alertmanager" containerID="cri-o://0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048" gracePeriod=120 Apr 17 07:56:45.665444 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.665341 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-metric" containerID="cri-o://872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e" gracePeriod=120 Apr 17 07:56:45.665444 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.665368 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-web" containerID="cri-o://f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df" gracePeriod=120 Apr 17 07:56:45.665655 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.665432 2560 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy" containerID="cri-o://7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df" gracePeriod=120 Apr 17 07:56:45.666223 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.665442 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="prom-label-proxy" containerID="cri-o://b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb" gracePeriod=120 Apr 17 07:56:45.666223 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.665910 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="config-reloader" containerID="cri-o://cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f" gracePeriod=120 Apr 17 07:56:45.932730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932646 2560 generic.go:358] "Generic (PLEG): container finished" podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb" exitCode=0 Apr 17 07:56:45.932730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932670 2560 generic.go:358] "Generic (PLEG): container finished" podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df" exitCode=0 Apr 17 07:56:45.932730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932677 2560 generic.go:358] "Generic (PLEG): container finished" podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f" exitCode=0 Apr 17 07:56:45.932730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932683 2560 generic.go:358] "Generic (PLEG): container finished" 
podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048" exitCode=0 Apr 17 07:56:45.933021 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932725 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb"} Apr 17 07:56:45.933021 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932769 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df"} Apr 17 07:56:45.933021 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932779 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f"} Apr 17 07:56:45.933021 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:45.932788 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048"} Apr 17 07:56:46.906629 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.906607 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:46.940044 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940003 2560 generic.go:358] "Generic (PLEG): container finished" podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e" exitCode=0 Apr 17 07:56:46.940044 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940038 2560 generic.go:358] "Generic (PLEG): container finished" podID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerID="f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df" exitCode=0 Apr 17 07:56:46.940272 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940102 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e"} Apr 17 07:56:46.940272 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940136 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:46.940272 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940147 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df"} Apr 17 07:56:46.940272 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940165 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec","Type":"ContainerDied","Data":"4359ca0356bdcecdd11d2147e57c0bbbb788fbb4ccd6eaf3efc513a4ab8faf47"} Apr 17 07:56:46.940272 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.940188 2560 scope.go:117] "RemoveContainer" containerID="b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb" Apr 17 07:56:46.950858 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.950833 2560 scope.go:117] "RemoveContainer" containerID="872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e" Apr 17 07:56:46.953070 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953048 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-out\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953168 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953097 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953168 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:56:46.953144 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-web-config\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953276 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953171 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-metrics-client-ca\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953276 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953200 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953276 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953236 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-main-db\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953276 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953265 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953298 2560 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-tls-assets\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953323 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpfgf\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-kube-api-access-zpfgf\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953355 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-volume\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953386 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953410 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.953469 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.953454 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-cluster-tls-config\") pod \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\" (UID: \"7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec\") " Apr 17 07:56:46.954823 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.954651 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:46.955256 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.954974 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:46.955256 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.955053 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:46.956004 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.955961 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.956520 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.956498 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-volume" (OuterVolumeSpecName: "config-volume") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.956636 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.956585 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-out" (OuterVolumeSpecName: "config-out") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:46.957405 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.957372 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.957804 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.957773 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.958257 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.958216 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-kube-api-access-zpfgf" (OuterVolumeSpecName: "kube-api-access-zpfgf") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "kube-api-access-zpfgf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:46.958524 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.958494 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.958798 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.958781 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:46.963356 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.963321 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.968530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.968502 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-web-config" (OuterVolumeSpecName: "web-config") pod "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" (UID: "7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:46.968831 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.968814 2560 scope.go:117] "RemoveContainer" containerID="7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df" Apr 17 07:56:46.975969 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.975951 2560 scope.go:117] "RemoveContainer" containerID="f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df" Apr 17 07:56:46.982346 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.982318 2560 scope.go:117] "RemoveContainer" containerID="cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f" Apr 17 07:56:46.988687 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.988671 2560 scope.go:117] "RemoveContainer" containerID="0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048" Apr 17 07:56:46.994866 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:46.994848 2560 scope.go:117] "RemoveContainer" containerID="33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242" Apr 17 07:56:47.000914 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:56:47.000895 2560 scope.go:117] "RemoveContainer" containerID="b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb" Apr 17 07:56:47.001200 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.001176 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb\": container with ID starting with b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb not found: ID does not exist" containerID="b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb" Apr 17 07:56:47.001284 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001207 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb"} err="failed to get container status \"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb\": rpc error: code = NotFound desc = could not find container \"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb\": container with ID starting with b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb not found: ID does not exist" Apr 17 07:56:47.001284 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001225 2560 scope.go:117] "RemoveContainer" containerID="872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e" Apr 17 07:56:47.001465 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.001447 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e\": container with ID starting with 872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e not found: ID does not exist" containerID="872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e" Apr 17 07:56:47.001512 ip-10-0-141-224 kubenswrapper[2560]: 
I0417 07:56:47.001475 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e"} err="failed to get container status \"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e\": rpc error: code = NotFound desc = could not find container \"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e\": container with ID starting with 872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e not found: ID does not exist" Apr 17 07:56:47.001512 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001492 2560 scope.go:117] "RemoveContainer" containerID="7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df" Apr 17 07:56:47.001701 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.001683 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df\": container with ID starting with 7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df not found: ID does not exist" containerID="7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df" Apr 17 07:56:47.001762 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001709 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df"} err="failed to get container status \"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df\": rpc error: code = NotFound desc = could not find container \"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df\": container with ID starting with 7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df not found: ID does not exist" Apr 17 07:56:47.001762 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001729 2560 scope.go:117] "RemoveContainer" 
containerID="f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df" Apr 17 07:56:47.001959 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.001944 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df\": container with ID starting with f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df not found: ID does not exist" containerID="f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df" Apr 17 07:56:47.002092 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001963 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df"} err="failed to get container status \"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df\": rpc error: code = NotFound desc = could not find container \"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df\": container with ID starting with f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df not found: ID does not exist" Apr 17 07:56:47.002092 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.001976 2560 scope.go:117] "RemoveContainer" containerID="cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f" Apr 17 07:56:47.002215 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.002198 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f\": container with ID starting with cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f not found: ID does not exist" containerID="cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f" Apr 17 07:56:47.002253 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002220 2560 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f"} err="failed to get container status \"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f\": rpc error: code = NotFound desc = could not find container \"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f\": container with ID starting with cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f not found: ID does not exist" Apr 17 07:56:47.002253 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002242 2560 scope.go:117] "RemoveContainer" containerID="0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048" Apr 17 07:56:47.002465 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.002448 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048\": container with ID starting with 0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048 not found: ID does not exist" containerID="0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048" Apr 17 07:56:47.002507 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002468 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048"} err="failed to get container status \"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048\": rpc error: code = NotFound desc = could not find container \"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048\": container with ID starting with 0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048 not found: ID does not exist" Apr 17 07:56:47.002507 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002483 2560 scope.go:117] "RemoveContainer" containerID="33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242" Apr 17 07:56:47.002656 
ip-10-0-141-224 kubenswrapper[2560]: E0417 07:56:47.002641 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242\": container with ID starting with 33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242 not found: ID does not exist" containerID="33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242" Apr 17 07:56:47.002692 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002657 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242"} err="failed to get container status \"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242\": rpc error: code = NotFound desc = could not find container \"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242\": container with ID starting with 33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242 not found: ID does not exist" Apr 17 07:56:47.002692 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002669 2560 scope.go:117] "RemoveContainer" containerID="b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb" Apr 17 07:56:47.002819 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002805 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb"} err="failed to get container status \"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb\": rpc error: code = NotFound desc = could not find container \"b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb\": container with ID starting with b0eb512b52cb1b8e56bcb2e9335db4dbc837c0aed513b91307f980292ea5b0fb not found: ID does not exist" Apr 17 07:56:47.002863 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002819 2560 scope.go:117] 
"RemoveContainer" containerID="872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e" Apr 17 07:56:47.002980 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002966 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e"} err="failed to get container status \"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e\": rpc error: code = NotFound desc = could not find container \"872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e\": container with ID starting with 872b44dacb0ef3f35fc15386b2639db0f70234ebbb76c518374f24e28167af7e not found: ID does not exist" Apr 17 07:56:47.003046 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.002979 2560 scope.go:117] "RemoveContainer" containerID="7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df" Apr 17 07:56:47.003206 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003188 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df"} err="failed to get container status \"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df\": rpc error: code = NotFound desc = could not find container \"7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df\": container with ID starting with 7503d71b84e2312597c512ceb53aeadf551cdc68d16f64be5a6357c81bda70df not found: ID does not exist" Apr 17 07:56:47.003276 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003208 2560 scope.go:117] "RemoveContainer" containerID="f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df" Apr 17 07:56:47.003434 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003416 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df"} err="failed to get container 
status \"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df\": rpc error: code = NotFound desc = could not find container \"f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df\": container with ID starting with f7be6dba7c6204e6f0752a934d090c8b774b1665ea38ac3c431907cf95a188df not found: ID does not exist" Apr 17 07:56:47.003479 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003435 2560 scope.go:117] "RemoveContainer" containerID="cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f" Apr 17 07:56:47.003645 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003627 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f"} err="failed to get container status \"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f\": rpc error: code = NotFound desc = could not find container \"cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f\": container with ID starting with cd599641275a6b34b60fc0a0cacd0464b38c66ab3d1818955499b6d0782c5e6f not found: ID does not exist" Apr 17 07:56:47.003689 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003645 2560 scope.go:117] "RemoveContainer" containerID="0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048" Apr 17 07:56:47.003832 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.003814 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048"} err="failed to get container status \"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048\": rpc error: code = NotFound desc = could not find container \"0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048\": container with ID starting with 0c8a748d79fd17c6c773588a41168a7ce1ecd8248a97608ffeed161b526d3048 not found: ID does not exist" Apr 17 07:56:47.003898 ip-10-0-141-224 
kubenswrapper[2560]: I0417 07:56:47.003833 2560 scope.go:117] "RemoveContainer" containerID="33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242" Apr 17 07:56:47.004053 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.004031 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242"} err="failed to get container status \"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242\": rpc error: code = NotFound desc = could not find container \"33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242\": container with ID starting with 33d269218ebf28c6238bc33230948522cbeaee710313654f81e4335a85ac6242 not found: ID does not exist" Apr 17 07:56:47.054585 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054551 2560 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-out\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054585 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054581 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054585 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054592 2560 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-web-config\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054602 2560 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-metrics-client-ca\") on 
node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054611 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054620 2560 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-main-db\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054629 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054639 2560 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-tls-assets\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054648 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpfgf\" (UniqueName: \"kubernetes.io/projected/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-kube-api-access-zpfgf\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054658 2560 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-config-volume\") on node 
\"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054666 2560 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054675 2560 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-secret-alertmanager-main-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.054800 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.054686 2560 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec-cluster-tls-config\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 07:56:47.262970 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.262940 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:47.266690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.266668 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:47.290861 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.290827 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:47.291186 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291169 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="config-reloader" Apr 17 07:56:47.291234 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291191 2560 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="config-reloader" Apr 17 07:56:47.291234 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291204 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy" Apr 17 07:56:47.291234 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291213 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy" Apr 17 07:56:47.291234 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291223 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="alertmanager" Apr 17 07:56:47.291234 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291232 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="alertmanager" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291240 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfacbb4c-e321-492b-9f6c-f223c66aba6e" containerName="registry" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291248 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfacbb4c-e321-492b-9f6c-f223c66aba6e" containerName="registry" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291266 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-metric" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291275 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-metric" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291287 2560 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-web" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291295 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-web" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291304 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="prom-label-proxy" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291313 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="prom-label-proxy" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291325 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="init-config-reloader" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291333 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="init-config-reloader" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291342 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1f86369-8a7f-4a7c-a739-a83856871552" containerName="console" Apr 17 07:56:47.291371 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291350 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f86369-8a7f-4a7c-a739-a83856871552" containerName="console" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291408 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="prom-label-proxy" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291420 2560 
memory_manager.go:356] "RemoveStaleState removing state" podUID="d1f86369-8a7f-4a7c-a739-a83856871552" containerName="console" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291430 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfacbb4c-e321-492b-9f6c-f223c66aba6e" containerName="registry" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291440 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="config-reloader" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291452 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="alertmanager" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291462 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-web" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291472 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy" Apr 17 07:56:47.291744 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.291482 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" containerName="kube-rbac-proxy-metric" Apr 17 07:56:47.297010 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.296967 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.300016 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.299929 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 07:56:47.300016 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.299953 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 07:56:47.300016 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.299963 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 07:56:47.300235 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.300019 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 07:56:47.300235 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.299932 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-w46vb\"" Apr 17 07:56:47.300399 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.300384 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 07:56:47.300473 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.300402 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 07:56:47.300564 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.300549 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 07:56:47.300655 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.300625 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 07:56:47.305322 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.305303 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 07:56:47.309497 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.309473 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:47.357236 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357191 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3b69634-5c7f-46df-9cfe-1c3746d89b86-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357236 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357231 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc57t\" (UniqueName: \"kubernetes.io/projected/f3b69634-5c7f-46df-9cfe-1c3746d89b86-kube-api-access-gc57t\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357253 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357335 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-web-config\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357371 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357392 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357425 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3b69634-5c7f-46df-9cfe-1c3746d89b86-config-out\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357576 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357496 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b69634-5c7f-46df-9cfe-1c3746d89b86-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
07:56:47.357576 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357543 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357576 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357571 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3b69634-5c7f-46df-9cfe-1c3746d89b86-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357669 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357615 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-config-volume\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357669 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357635 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3b69634-5c7f-46df-9cfe-1c3746d89b86-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.357669 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.357651 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.458954 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.458914 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-config-volume\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459072 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.458960 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3b69634-5c7f-46df-9cfe-1c3746d89b86-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459072 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459007 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459072 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459041 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3b69634-5c7f-46df-9cfe-1c3746d89b86-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459179 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459155 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc57t\" (UniqueName: \"kubernetes.io/projected/f3b69634-5c7f-46df-9cfe-1c3746d89b86-kube-api-access-gc57t\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459230 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459210 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459266 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459255 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-web-config\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459311 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459289 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459359 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459321 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459410 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459363 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3b69634-5c7f-46df-9cfe-1c3746d89b86-config-out\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459410 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459388 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b69634-5c7f-46df-9cfe-1c3746d89b86-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459511 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459437 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459511 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459457 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3b69634-5c7f-46df-9cfe-1c3746d89b86-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459511 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459465 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3b69634-5c7f-46df-9cfe-1c3746d89b86-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.459808 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.459776 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3b69634-5c7f-46df-9cfe-1c3746d89b86-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.460760 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.460495 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b69634-5c7f-46df-9cfe-1c3746d89b86-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462395 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462193 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-config-volume\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462395 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462326 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462568 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462402 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462568 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462424 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462568 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462447 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3b69634-5c7f-46df-9cfe-1c3746d89b86-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462568 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462450 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-web-config\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462783 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462762 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3b69634-5c7f-46df-9cfe-1c3746d89b86-config-out\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.462869 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.462850 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.463997 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.463969 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f3b69634-5c7f-46df-9cfe-1c3746d89b86-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.466462 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.466444 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc57t\" (UniqueName: \"kubernetes.io/projected/f3b69634-5c7f-46df-9cfe-1c3746d89b86-kube-api-access-gc57t\") pod \"alertmanager-main-0\" (UID: \"f3b69634-5c7f-46df-9cfe-1c3746d89b86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.605748 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.605709 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:47.730749 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.730726 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:47.733656 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:56:47.733624 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b69634_5c7f_46df_9cfe_1c3746d89b86.slice/crio-b7ee1bc98a30e96ff4542b5d33cd20e0e121315a4c292435257816c8d9b7839e WatchSource:0}: Error finding container b7ee1bc98a30e96ff4542b5d33cd20e0e121315a4c292435257816c8d9b7839e: Status 404 returned error can't find the container with id b7ee1bc98a30e96ff4542b5d33cd20e0e121315a4c292435257816c8d9b7839e Apr 17 07:56:47.945604 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.945519 2560 generic.go:358] "Generic (PLEG): container finished" podID="f3b69634-5c7f-46df-9cfe-1c3746d89b86" containerID="4da6c99c73fb8c89839e1a8305031d61e7d74ca3baaaa4aca1a0c58d2b320503" exitCode=0 Apr 17 07:56:47.945604 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.945579 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerDied","Data":"4da6c99c73fb8c89839e1a8305031d61e7d74ca3baaaa4aca1a0c58d2b320503"} Apr 17 07:56:47.946023 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:47.945606 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"b7ee1bc98a30e96ff4542b5d33cd20e0e121315a4c292435257816c8d9b7839e"} Apr 17 07:56:48.132809 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.132769 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec" 
path="/var/lib/kubelet/pods/7309d7e6-66ac-41d9-b7ce-d64b60b8e5ec/volumes" Apr 17 07:56:48.952524 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.952487 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"659f783192e7fc9866c3d2e22716cc714956c6097ca8e2c15e52ced48998cb7c"} Apr 17 07:56:48.952524 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.952525 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"fe5c223fe100fc73beafcae3bbc2625f1c9e092006ba31831f9970c69a72df6b"} Apr 17 07:56:48.952918 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.952535 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"6ef6dbbf1b14806e5a0e7aa3323b45b0e45626b65378b568acf74a38ec022aff"} Apr 17 07:56:48.952918 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.952544 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"8268777a1b2d97d2090bdafd349b18550dcf3967991e636a90426a65ee2eeb93"} Apr 17 07:56:48.952918 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.952553 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"4f720eae1bb161e6a02ac4875c5e9fbb0bf5f9c31f419318b35968186b104f40"} Apr 17 07:56:48.952918 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.952561 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f3b69634-5c7f-46df-9cfe-1c3746d89b86","Type":"ContainerStarted","Data":"147f88745bd9e334b72a17ebada1467a5aca873e2989c0fd549f18d52c5b1dfd"} Apr 17 07:56:48.979725 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:48.979671 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.979656622 podStartE2EDuration="1.979656622s" podCreationTimestamp="2026-04-17 07:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:48.978097103 +0000 UTC m=+267.465218905" watchObservedRunningTime="2026-04-17 07:56:48.979656622 +0000 UTC m=+267.466778425" Apr 17 07:56:49.692843 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.692800 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-586dd88f6-tz2xg"] Apr 17 07:56:49.695901 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.695879 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.698334 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.698313 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7hng8\"" Apr 17 07:56:49.698430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.698315 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 07:56:49.698430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.698315 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 07:56:49.699104 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.699091 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 07:56:49.699608 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.699591 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 07:56:49.699701 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.699662 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 07:56:49.705255 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.705236 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 07:56:49.709455 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.709432 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-586dd88f6-tz2xg"] Apr 17 07:56:49.778395 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778364 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-serving-certs-ca-bundle\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778557 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778410 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778557 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778531 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-telemeter-client-tls\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778634 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778569 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-federate-client-tls\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778634 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778603 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778702 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778650 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-secret-telemeter-client\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778738 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778704 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-metrics-client-ca\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.778768 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.778738 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b98jp\" (UniqueName: \"kubernetes.io/projected/aa20a5ef-7210-40b0-9d12-74a82d28d52a-kube-api-access-b98jp\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.879551 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879512 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-telemeter-client-tls\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: 
\"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.879551 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879553 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-federate-client-tls\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.879790 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879575 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.879790 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879600 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-secret-telemeter-client\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.879790 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879625 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-metrics-client-ca\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.879790 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879775 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b98jp\" (UniqueName: \"kubernetes.io/projected/aa20a5ef-7210-40b0-9d12-74a82d28d52a-kube-api-access-b98jp\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.880024 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879807 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-serving-certs-ca-bundle\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.880024 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.879850 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.880365 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.880344 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-metrics-client-ca\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.880551 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.880528 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-serving-certs-ca-bundle\") pod 
\"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.880762 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.880746 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa20a5ef-7210-40b0-9d12-74a82d28d52a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.882679 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.882649 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-secret-telemeter-client\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.882755 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.882741 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.882794 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.882760 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-telemeter-client-tls\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.883137 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:56:49.883119 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/aa20a5ef-7210-40b0-9d12-74a82d28d52a-federate-client-tls\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:49.887964 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:49.887937 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b98jp\" (UniqueName: \"kubernetes.io/projected/aa20a5ef-7210-40b0-9d12-74a82d28d52a-kube-api-access-b98jp\") pod \"telemeter-client-586dd88f6-tz2xg\" (UID: \"aa20a5ef-7210-40b0-9d12-74a82d28d52a\") " pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:50.005415 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:50.005329 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" Apr 17 07:56:50.127084 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:50.127050 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-586dd88f6-tz2xg"] Apr 17 07:56:50.130362 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:56:50.130327 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa20a5ef_7210_40b0_9d12_74a82d28d52a.slice/crio-1930903e2f3172266a3fc49b4a1cafb98d61f54d92e7531e1b92c8f99186b5cd WatchSource:0}: Error finding container 1930903e2f3172266a3fc49b4a1cafb98d61f54d92e7531e1b92c8f99186b5cd: Status 404 returned error can't find the container with id 1930903e2f3172266a3fc49b4a1cafb98d61f54d92e7531e1b92c8f99186b5cd Apr 17 07:56:50.961474 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:50.961437 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" 
event={"ID":"aa20a5ef-7210-40b0-9d12-74a82d28d52a","Type":"ContainerStarted","Data":"1930903e2f3172266a3fc49b4a1cafb98d61f54d92e7531e1b92c8f99186b5cd"} Apr 17 07:56:52.969047 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:52.969010 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" event={"ID":"aa20a5ef-7210-40b0-9d12-74a82d28d52a","Type":"ContainerStarted","Data":"0955066b6ee7de75330ccb9f3c97284d6f796d7425776162bd297d03e4a3d020"} Apr 17 07:56:52.969047 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:52.969049 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" event={"ID":"aa20a5ef-7210-40b0-9d12-74a82d28d52a","Type":"ContainerStarted","Data":"ded5948114f4ca91fb45ea0f80e4f05e7462f5359f6858450ce869167f8d5413"} Apr 17 07:56:52.969473 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:52.969060 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" event={"ID":"aa20a5ef-7210-40b0-9d12-74a82d28d52a","Type":"ContainerStarted","Data":"100590bad54996fca1cec91991e053a1332bf5f2f027980ff93d5631ff78ad55"} Apr 17 07:56:52.991897 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:52.991844 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-586dd88f6-tz2xg" podStartSLOduration=2.185728223 podStartE2EDuration="3.991828059s" podCreationTimestamp="2026-04-17 07:56:49 +0000 UTC" firstStartedPulling="2026-04-17 07:56:50.132227563 +0000 UTC m=+268.619349345" lastFinishedPulling="2026-04-17 07:56:51.9383274 +0000 UTC m=+270.425449181" observedRunningTime="2026-04-17 07:56:52.990271602 +0000 UTC m=+271.477393407" watchObservedRunningTime="2026-04-17 07:56:52.991828059 +0000 UTC m=+271.478949861" Apr 17 07:56:53.662398 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.662364 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-689b84d64b-knfnb"] Apr 17 07:56:53.664911 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.664891 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.667323 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.667299 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j6lrp\"" Apr 17 07:56:53.668368 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668350 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 07:56:53.668518 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668375 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 07:56:53.668690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668394 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 07:56:53.668690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668409 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 07:56:53.668690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668418 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 07:56:53.668690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668436 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 07:56:53.668690 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.668448 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 07:56:53.672802 ip-10-0-141-224 kubenswrapper[2560]: I0417 
07:56:53.672785 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 07:56:53.677974 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.677949 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-689b84d64b-knfnb"] Apr 17 07:56:53.813596 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813554 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-oauth-config\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.813596 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813591 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-trusted-ca-bundle\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.813806 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813653 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-service-ca\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.813806 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813675 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-config\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " 
pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.813806 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813748 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-oauth-serving-cert\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.813806 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813778 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnp7\" (UniqueName: \"kubernetes.io/projected/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-kube-api-access-nwnp7\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.813927 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.813812 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-serving-cert\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.914978 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-service-ca\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915094 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915052 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-config\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915318 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915096 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-oauth-serving-cert\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915318 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915123 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwnp7\" (UniqueName: \"kubernetes.io/projected/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-kube-api-access-nwnp7\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915318 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915162 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-serving-cert\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915318 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915190 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-oauth-config\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915318 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915211 2560 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-trusted-ca-bundle\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915766 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915740 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-service-ca\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915855 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915802 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-oauth-serving-cert\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.915855 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.915802 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-config\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.916114 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.916092 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-trusted-ca-bundle\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.918600 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.918584 2560 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-oauth-config\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.918813 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.918792 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-serving-cert\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.923227 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.923207 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwnp7\" (UniqueName: \"kubernetes.io/projected/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-kube-api-access-nwnp7\") pod \"console-689b84d64b-knfnb\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") " pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:53.974898 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:53.974871 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:56:54.094845 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:54.094812 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-689b84d64b-knfnb"] Apr 17 07:56:54.098221 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:56:54.098191 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f8f4a0_66f4_4544_9d9b_cdbb300bed69.slice/crio-48c9d0349ee530468d7c997cc4a64a0a18610641ebb69931c8d19f14f312d7ae WatchSource:0}: Error finding container 48c9d0349ee530468d7c997cc4a64a0a18610641ebb69931c8d19f14f312d7ae: Status 404 returned error can't find the container with id 48c9d0349ee530468d7c997cc4a64a0a18610641ebb69931c8d19f14f312d7ae Apr 17 07:56:54.977351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:54.977306 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b84d64b-knfnb" event={"ID":"44f8f4a0-66f4-4544-9d9b-cdbb300bed69","Type":"ContainerStarted","Data":"0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4"} Apr 17 07:56:54.977351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:54.977349 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b84d64b-knfnb" event={"ID":"44f8f4a0-66f4-4544-9d9b-cdbb300bed69","Type":"ContainerStarted","Data":"48c9d0349ee530468d7c997cc4a64a0a18610641ebb69931c8d19f14f312d7ae"} Apr 17 07:56:54.994975 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:56:54.994919 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-689b84d64b-knfnb" podStartSLOduration=1.99490275 podStartE2EDuration="1.99490275s" podCreationTimestamp="2026-04-17 07:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:54.992841416 +0000 UTC m=+273.479963219" 
watchObservedRunningTime="2026-04-17 07:56:54.99490275 +0000 UTC m=+273.482024553" Apr 17 07:57:03.975649 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:57:03.975591 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:57:03.975649 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:57:03.975652 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:57:03.980170 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:57:03.980148 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:57:04.002934 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:57:04.002906 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-689b84d64b-knfnb" Apr 17 07:57:21.997296 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:57:21.997269 2560 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 07:58:20.976171 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:20.976091 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-689b84d64b-knfnb"] Apr 17 07:58:35.145618 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.145581 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8d5w9"] Apr 17 07:58:35.149089 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.149065 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.151300 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.151279 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 07:58:35.153420 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.153396 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8d5w9"] Apr 17 07:58:35.205362 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.205323 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0cebed97-5a79-4daf-8b36-7fb10b919eaa-original-pull-secret\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.205362 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.205367 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0cebed97-5a79-4daf-8b36-7fb10b919eaa-kubelet-config\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.205570 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.205387 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0cebed97-5a79-4daf-8b36-7fb10b919eaa-dbus\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.306776 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.306737 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/0cebed97-5a79-4daf-8b36-7fb10b919eaa-original-pull-secret\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.306776 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.306778 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0cebed97-5a79-4daf-8b36-7fb10b919eaa-kubelet-config\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.307006 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.306804 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0cebed97-5a79-4daf-8b36-7fb10b919eaa-dbus\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.307006 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.306885 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0cebed97-5a79-4daf-8b36-7fb10b919eaa-kubelet-config\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.307006 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.306945 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0cebed97-5a79-4daf-8b36-7fb10b919eaa-dbus\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.309153 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.309121 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0cebed97-5a79-4daf-8b36-7fb10b919eaa-original-pull-secret\") pod \"global-pull-secret-syncer-8d5w9\" (UID: \"0cebed97-5a79-4daf-8b36-7fb10b919eaa\") " pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.459753 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.459663 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8d5w9" Apr 17 07:58:35.575047 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.575010 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8d5w9"] Apr 17 07:58:35.577962 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:58:35.577922 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cebed97_5a79_4daf_8b36_7fb10b919eaa.slice/crio-dca1214be679757039082e8f380569b3a43ecc6bc82fdd9bc252f3a85cd3042b WatchSource:0}: Error finding container dca1214be679757039082e8f380569b3a43ecc6bc82fdd9bc252f3a85cd3042b: Status 404 returned error can't find the container with id dca1214be679757039082e8f380569b3a43ecc6bc82fdd9bc252f3a85cd3042b Apr 17 07:58:35.579587 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:35.579571 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:58:36.250296 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:36.250255 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8d5w9" event={"ID":"0cebed97-5a79-4daf-8b36-7fb10b919eaa","Type":"ContainerStarted","Data":"dca1214be679757039082e8f380569b3a43ecc6bc82fdd9bc252f3a85cd3042b"} Apr 17 07:58:40.263420 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:40.263380 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8d5w9" 
event={"ID":"0cebed97-5a79-4daf-8b36-7fb10b919eaa","Type":"ContainerStarted","Data":"221373e42b4ebd22f2f0d7a971d4ab806cecce948cecaf9324d35ab5f01ece5d"}
Apr 17 07:58:40.277372 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:40.277318 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8d5w9" podStartSLOduration=1.393679675 podStartE2EDuration="5.27729994s" podCreationTimestamp="2026-04-17 07:58:35 +0000 UTC" firstStartedPulling="2026-04-17 07:58:35.579693244 +0000 UTC m=+374.066815025" lastFinishedPulling="2026-04-17 07:58:39.463313494 +0000 UTC m=+377.950435290" observedRunningTime="2026-04-17 07:58:40.276664077 +0000 UTC m=+378.763785893" watchObservedRunningTime="2026-04-17 07:58:40.27729994 +0000 UTC m=+378.764421744"
Apr 17 07:58:45.998400 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:45.998362 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-689b84d64b-knfnb" podUID="44f8f4a0-66f4-4544-9d9b-cdbb300bed69" containerName="console" containerID="cri-o://0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4" gracePeriod=15
Apr 17 07:58:46.240595 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.240573 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-689b84d64b-knfnb_44f8f4a0-66f4-4544-9d9b-cdbb300bed69/console/0.log"
Apr 17 07:58:46.240719 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.240633 2560 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-689b84d64b-knfnb"
Apr 17 07:58:46.283082 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.283012 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-689b84d64b-knfnb_44f8f4a0-66f4-4544-9d9b-cdbb300bed69/console/0.log"
Apr 17 07:58:46.283082 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.283048 2560 generic.go:358] "Generic (PLEG): container finished" podID="44f8f4a0-66f4-4544-9d9b-cdbb300bed69" containerID="0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4" exitCode=2
Apr 17 07:58:46.283303 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.283111 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-689b84d64b-knfnb"
Apr 17 07:58:46.283303 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.283108 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b84d64b-knfnb" event={"ID":"44f8f4a0-66f4-4544-9d9b-cdbb300bed69","Type":"ContainerDied","Data":"0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4"}
Apr 17 07:58:46.283303 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.283155 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-689b84d64b-knfnb" event={"ID":"44f8f4a0-66f4-4544-9d9b-cdbb300bed69","Type":"ContainerDied","Data":"48c9d0349ee530468d7c997cc4a64a0a18610641ebb69931c8d19f14f312d7ae"}
Apr 17 07:58:46.283303 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.283175 2560 scope.go:117] "RemoveContainer" containerID="0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4"
Apr 17 07:58:46.290626 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.290606 2560 scope.go:117] "RemoveContainer" containerID="0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4"
Apr 17 07:58:46.290857 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:58:46.290838 2560 log.go:32] "ContainerStatus from runtime
service failed" err="rpc error: code = NotFound desc = could not find container \"0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4\": container with ID starting with 0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4 not found: ID does not exist" containerID="0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4"
Apr 17 07:58:46.290921 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.290866 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4"} err="failed to get container status \"0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4\": rpc error: code = NotFound desc = could not find container \"0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4\": container with ID starting with 0e566744557718f752fc7c0f73611d24f4a705a8357c0c7326431964e28094b4 not found: ID does not exist"
Apr 17 07:58:46.296219 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296200 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-serving-cert\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296279 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296235 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-oauth-serving-cert\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296279 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296255 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName:
\"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-oauth-config\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296279 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296270 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-service-ca\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296424 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296397 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-trusted-ca-bundle\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296476 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296443 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwnp7\" (UniqueName: \"kubernetes.io/projected/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-kube-api-access-nwnp7\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296528 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296505 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-config\") pod \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\" (UID: \"44f8f4a0-66f4-4544-9d9b-cdbb300bed69\") "
Apr 17 07:58:46.296665 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296639 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-service-ca" (OuterVolumeSpecName: "service-ca") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"
(UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:58:46.296737 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296672 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69" (UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:58:46.296863 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296810 2560 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-oauth-serving-cert\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.296863 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296837 2560 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-service-ca\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.297011 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296882 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69" (UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:58:46.297011 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.296959 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-config" (OuterVolumeSpecName: "console-config") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69" (UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:58:46.298528 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.298502 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69" (UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:58:46.298591 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.298528 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-kube-api-access-nwnp7" (OuterVolumeSpecName: "kube-api-access-nwnp7") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69" (UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "kube-api-access-nwnp7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:58:46.298591 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.298573 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "44f8f4a0-66f4-4544-9d9b-cdbb300bed69" (UID: "44f8f4a0-66f4-4544-9d9b-cdbb300bed69"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:58:46.398117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.398063 2560 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-config\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.398117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.398109 2560 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-serving-cert\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.398117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.398122 2560 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-console-oauth-config\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.398117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.398135 2560 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-trusted-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.398378 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.398147 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nwnp7\" (UniqueName: \"kubernetes.io/projected/44f8f4a0-66f4-4544-9d9b-cdbb300bed69-kube-api-access-nwnp7\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:58:46.621592 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.621557 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-689b84d64b-knfnb"]
Apr 17 07:58:46.630062 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:46.630034 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api"
pods=["openshift-console/console-689b84d64b-knfnb"]
Apr 17 07:58:48.132753 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:48.132718 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f8f4a0-66f4-4544-9d9b-cdbb300bed69" path="/var/lib/kubelet/pods/44f8f4a0-66f4-4544-9d9b-cdbb300bed69/volumes"
Apr 17 07:58:54.432748 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.432714 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"]
Apr 17 07:58:54.433194 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.433001 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44f8f4a0-66f4-4544-9d9b-cdbb300bed69" containerName="console"
Apr 17 07:58:54.433194 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.433017 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f8f4a0-66f4-4544-9d9b-cdbb300bed69" containerName="console"
Apr 17 07:58:54.433194 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.433079 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="44f8f4a0-66f4-4544-9d9b-cdbb300bed69" containerName="console"
Apr 17 07:58:54.437651 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.437634 2560 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.443107 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.443079 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jpkv2\""
Apr 17 07:58:54.443220 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.443087 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 07:58:54.443220 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.443128 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 07:58:54.452911 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.452883 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"]
Apr 17 07:58:54.561867 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.561826 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwjrf\" (UniqueName: \"kubernetes.io/projected/c3a16a88-42b0-4b3f-876f-81950f78a844-kube-api-access-hwjrf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.562100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.561879 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") "
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.562100 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.562004 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.662901 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.662860 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwjrf\" (UniqueName: \"kubernetes.io/projected/c3a16a88-42b0-4b3f-876f-81950f78a844-kube-api-access-hwjrf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.663117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.662914 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.663117 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.662956 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") "
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.663347 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.663323 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.663424 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.663365 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.670916 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.670894 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwjrf\" (UniqueName: \"kubernetes.io/projected/c3a16a88-42b0-4b3f-876f-81950f78a844-kube-api-access-hwjrf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.746642 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.746544 2560 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:58:54.864874 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:54.864850 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"]
Apr 17 07:58:54.867635 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:58:54.867606 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3a16a88_42b0_4b3f_876f_81950f78a844.slice/crio-1bdb4ccf9b7f6e51b3c19ba4af143cd51db6158eca4a7346e2d53e356220ae36 WatchSource:0}: Error finding container 1bdb4ccf9b7f6e51b3c19ba4af143cd51db6158eca4a7346e2d53e356220ae36: Status 404 returned error can't find the container with id 1bdb4ccf9b7f6e51b3c19ba4af143cd51db6158eca4a7346e2d53e356220ae36
Apr 17 07:58:55.310977 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:58:55.310938 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk" event={"ID":"c3a16a88-42b0-4b3f-876f-81950f78a844","Type":"ContainerStarted","Data":"1bdb4ccf9b7f6e51b3c19ba4af143cd51db6158eca4a7346e2d53e356220ae36"}
Apr 17 07:59:00.328713 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:00.328680 2560 generic.go:358] "Generic (PLEG): container finished" podID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerID="384e90d08030aa74598a4bcde659c95d59c7f0f6d797b8f84b4e58da0ee0b5d3" exitCode=0
Apr 17 07:59:00.329108 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:00.328769 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk" event={"ID":"c3a16a88-42b0-4b3f-876f-81950f78a844","Type":"ContainerDied","Data":"384e90d08030aa74598a4bcde659c95d59c7f0f6d797b8f84b4e58da0ee0b5d3"}
Apr 17 07:59:03.340743 ip-10-0-141-224 kubenswrapper[2560]:
I0417 07:59:03.340704 2560 generic.go:358] "Generic (PLEG): container finished" podID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerID="5e4e14ffaf8a27392facd1f498b4b14ec6a693021e6f3cc2f8f3da229e5a3c17" exitCode=0
Apr 17 07:59:03.341232 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:03.340766 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk" event={"ID":"c3a16a88-42b0-4b3f-876f-81950f78a844","Type":"ContainerDied","Data":"5e4e14ffaf8a27392facd1f498b4b14ec6a693021e6f3cc2f8f3da229e5a3c17"}
Apr 17 07:59:10.363345 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:10.363315 2560 generic.go:358] "Generic (PLEG): container finished" podID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerID="3d812e570518744dd4493f92b3e0216bd6255888089704ead1e384d28474b144" exitCode=0
Apr 17 07:59:10.363717 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:10.363390 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk" event={"ID":"c3a16a88-42b0-4b3f-876f-81950f78a844","Type":"ContainerDied","Data":"3d812e570518744dd4493f92b3e0216bd6255888089704ead1e384d28474b144"}
Apr 17 07:59:11.480060 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.480034 2560 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:59:11.606556 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.606518 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-util\") pod \"c3a16a88-42b0-4b3f-876f-81950f78a844\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") "
Apr 17 07:59:11.606730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.606595 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-bundle\") pod \"c3a16a88-42b0-4b3f-876f-81950f78a844\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") "
Apr 17 07:59:11.606730 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.606645 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwjrf\" (UniqueName: \"kubernetes.io/projected/c3a16a88-42b0-4b3f-876f-81950f78a844-kube-api-access-hwjrf\") pod \"c3a16a88-42b0-4b3f-876f-81950f78a844\" (UID: \"c3a16a88-42b0-4b3f-876f-81950f78a844\") "
Apr 17 07:59:11.607279 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.607244 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-bundle" (OuterVolumeSpecName: "bundle") pod "c3a16a88-42b0-4b3f-876f-81950f78a844" (UID: "c3a16a88-42b0-4b3f-876f-81950f78a844"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 07:59:11.608779 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.608752 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a16a88-42b0-4b3f-876f-81950f78a844-kube-api-access-hwjrf" (OuterVolumeSpecName: "kube-api-access-hwjrf") pod "c3a16a88-42b0-4b3f-876f-81950f78a844" (UID: "c3a16a88-42b0-4b3f-876f-81950f78a844"). InnerVolumeSpecName "kube-api-access-hwjrf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:59:11.612566 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.612534 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-util" (OuterVolumeSpecName: "util") pod "c3a16a88-42b0-4b3f-876f-81950f78a844" (UID: "c3a16a88-42b0-4b3f-876f-81950f78a844"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 07:59:11.707336 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.707254 2560 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-util\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:59:11.707336 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.707286 2560 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3a16a88-42b0-4b3f-876f-81950f78a844-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:59:11.707336 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:11.707296 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwjrf\" (UniqueName: \"kubernetes.io/projected/c3a16a88-42b0-4b3f-876f-81950f78a844-kube-api-access-hwjrf\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 07:59:12.370756 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:12.370728 2560 util.go:48] "No ready sandbox
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk"
Apr 17 07:59:12.370914 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:12.370727 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm74hk" event={"ID":"c3a16a88-42b0-4b3f-876f-81950f78a844","Type":"ContainerDied","Data":"1bdb4ccf9b7f6e51b3c19ba4af143cd51db6158eca4a7346e2d53e356220ae36"}
Apr 17 07:59:12.370914 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:12.370833 2560 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdb4ccf9b7f6e51b3c19ba4af143cd51db6158eca4a7346e2d53e356220ae36"
Apr 17 07:59:16.068048 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068012 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"]
Apr 17 07:59:16.068519 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068409 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="pull"
Apr 17 07:59:16.068519 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068426 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="pull"
Apr 17 07:59:16.068519 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068448 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="extract"
Apr 17 07:59:16.068519 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068456 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="extract"
Apr 17 07:59:16.068519 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068471 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="util"
Apr 17 07:59:16.068519 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068480 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="util"
Apr 17 07:59:16.068812 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.068557 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3a16a88-42b0-4b3f-876f-81950f78a844" containerName="extract"
Apr 17 07:59:16.071717 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.071696 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"
Apr 17 07:59:16.074415 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.074383 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 07:59:16.074538 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.074387 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 07:59:16.074538 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.074432 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 07:59:16.074538 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.074466 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lhk45\""
Apr 17 07:59:16.084349 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.084311 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"]
Apr 17 07:59:16.240716 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.240678 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww8gn\" (UniqueName:
\"kubernetes.io/projected/a5c9cb3a-527c-411d-833b-d9f9239c30a6-kube-api-access-ww8gn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz\" (UID: \"a5c9cb3a-527c-411d-833b-d9f9239c30a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"
Apr 17 07:59:16.240894 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.240728 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a5c9cb3a-527c-411d-833b-d9f9239c30a6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz\" (UID: \"a5c9cb3a-527c-411d-833b-d9f9239c30a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"
Apr 17 07:59:16.342373 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.342282 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ww8gn\" (UniqueName: \"kubernetes.io/projected/a5c9cb3a-527c-411d-833b-d9f9239c30a6-kube-api-access-ww8gn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz\" (UID: \"a5c9cb3a-527c-411d-833b-d9f9239c30a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"
Apr 17 07:59:16.342373 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.342334 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a5c9cb3a-527c-411d-833b-d9f9239c30a6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz\" (UID: \"a5c9cb3a-527c-411d-833b-d9f9239c30a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"
Apr 17 07:59:16.344725 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.344692 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a5c9cb3a-527c-411d-833b-d9f9239c30a6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz\" (UID:
\"a5c9cb3a-527c-411d-833b-d9f9239c30a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" Apr 17 07:59:16.350051 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.350023 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww8gn\" (UniqueName: \"kubernetes.io/projected/a5c9cb3a-527c-411d-833b-d9f9239c30a6-kube-api-access-ww8gn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz\" (UID: \"a5c9cb3a-527c-411d-833b-d9f9239c30a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" Apr 17 07:59:16.381585 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.381555 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" Apr 17 07:59:16.515660 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:16.515554 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz"] Apr 17 07:59:16.518619 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:59:16.518594 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c9cb3a_527c_411d_833b_d9f9239c30a6.slice/crio-ae8a082c400e1ab9ec2e3a583cfef3e8c73bf7d31b6c1d7ae0efc2383cb49784 WatchSource:0}: Error finding container ae8a082c400e1ab9ec2e3a583cfef3e8c73bf7d31b6c1d7ae0efc2383cb49784: Status 404 returned error can't find the container with id ae8a082c400e1ab9ec2e3a583cfef3e8c73bf7d31b6c1d7ae0efc2383cb49784 Apr 17 07:59:17.387301 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:17.387264 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" event={"ID":"a5c9cb3a-527c-411d-833b-d9f9239c30a6","Type":"ContainerStarted","Data":"ae8a082c400e1ab9ec2e3a583cfef3e8c73bf7d31b6c1d7ae0efc2383cb49784"} Apr 17 07:59:20.408025 ip-10-0-141-224 kubenswrapper[2560]: 
I0417 07:59:20.407958 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" event={"ID":"a5c9cb3a-527c-411d-833b-d9f9239c30a6","Type":"ContainerStarted","Data":"24fc596476d5d053102959a334967ae577270c0e1f916869f26e91da6c7b1117"} Apr 17 07:59:20.408410 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.408100 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" Apr 17 07:59:20.428639 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.428580 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" podStartSLOduration=1.112092928 podStartE2EDuration="4.428560782s" podCreationTimestamp="2026-04-17 07:59:16 +0000 UTC" firstStartedPulling="2026-04-17 07:59:16.520244306 +0000 UTC m=+415.007366092" lastFinishedPulling="2026-04-17 07:59:19.836712164 +0000 UTC m=+418.323833946" observedRunningTime="2026-04-17 07:59:20.42580862 +0000 UTC m=+418.912930424" watchObservedRunningTime="2026-04-17 07:59:20.428560782 +0000 UTC m=+418.915682585" Apr 17 07:59:20.665707 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.665629 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q"] Apr 17 07:59:20.669221 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.669198 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.675548 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.675416 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k5vgx\"" Apr 17 07:59:20.677745 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.677724 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 07:59:20.677852 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.677842 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 07:59:20.687874 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.687849 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q"] Apr 17 07:59:20.783040 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.783003 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2302cefd-c787-4b1e-868e-1857713bc642-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.783216 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.783051 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.783216 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.783101 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w6vc9\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-kube-api-access-w6vc9\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.884090 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.884057 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2302cefd-c787-4b1e-868e-1857713bc642-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.884233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.884095 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.884233 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.884148 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6vc9\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-kube-api-access-w6vc9\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.884322 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:20.884283 2560 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:59:20.884322 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:20.884307 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:59:20.884382 ip-10-0-141-224 
kubenswrapper[2560]: E0417 07:59:20.884323 2560 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 07:59:20.884382 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:20.884342 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 07:59:20.884438 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:20.884401 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates podName:2302cefd-c787-4b1e-868e-1857713bc642 nodeName:}" failed. No retries permitted until 2026-04-17 07:59:21.384386679 +0000 UTC m=+419.871508460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates") pod "keda-metrics-apiserver-7c9f485588-8gn7q" (UID: "2302cefd-c787-4b1e-868e-1857713bc642") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 07:59:20.884530 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.884507 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2302cefd-c787-4b1e-868e-1857713bc642-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.888505 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.888483 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-7k8hp"] Apr 17 07:59:20.891733 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.891717 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:20.893932 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.893907 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 07:59:20.899084 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.899034 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6vc9\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-kube-api-access-w6vc9\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:20.900957 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:20.900928 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7k8hp"] Apr 17 07:59:21.085557 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.085522 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bda30002-3181-4cfe-a902-43f1649d5cd7-certificates\") pod \"keda-admission-cf49989db-7k8hp\" (UID: \"bda30002-3181-4cfe-a902-43f1649d5cd7\") " pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.085748 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.085598 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dm79\" (UniqueName: \"kubernetes.io/projected/bda30002-3181-4cfe-a902-43f1649d5cd7-kube-api-access-9dm79\") pod \"keda-admission-cf49989db-7k8hp\" (UID: \"bda30002-3181-4cfe-a902-43f1649d5cd7\") " pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.186672 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.186634 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dm79\" 
(UniqueName: \"kubernetes.io/projected/bda30002-3181-4cfe-a902-43f1649d5cd7-kube-api-access-9dm79\") pod \"keda-admission-cf49989db-7k8hp\" (UID: \"bda30002-3181-4cfe-a902-43f1649d5cd7\") " pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.186874 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.186732 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bda30002-3181-4cfe-a902-43f1649d5cd7-certificates\") pod \"keda-admission-cf49989db-7k8hp\" (UID: \"bda30002-3181-4cfe-a902-43f1649d5cd7\") " pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.189351 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.189324 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bda30002-3181-4cfe-a902-43f1649d5cd7-certificates\") pod \"keda-admission-cf49989db-7k8hp\" (UID: \"bda30002-3181-4cfe-a902-43f1649d5cd7\") " pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.198283 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.198262 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dm79\" (UniqueName: \"kubernetes.io/projected/bda30002-3181-4cfe-a902-43f1649d5cd7-kube-api-access-9dm79\") pod \"keda-admission-cf49989db-7k8hp\" (UID: \"bda30002-3181-4cfe-a902-43f1649d5cd7\") " pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.208096 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.208077 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:21.331430 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.331311 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7k8hp"] Apr 17 07:59:21.334083 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:59:21.334048 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda30002_3181_4cfe_a902_43f1649d5cd7.slice/crio-4fc5c437536f3327718042f3bbb9a99b047241bf9020e22d49a696a3c2117bff WatchSource:0}: Error finding container 4fc5c437536f3327718042f3bbb9a99b047241bf9020e22d49a696a3c2117bff: Status 404 returned error can't find the container with id 4fc5c437536f3327718042f3bbb9a99b047241bf9020e22d49a696a3c2117bff Apr 17 07:59:21.388506 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.388459 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:21.388670 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:21.388591 2560 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:59:21.388670 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:21.388616 2560 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:59:21.388670 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:21.388634 2560 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q: references non-existent secret key: tls.crt Apr 17 07:59:21.388762 ip-10-0-141-224 kubenswrapper[2560]: E0417 07:59:21.388686 2560 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates podName:2302cefd-c787-4b1e-868e-1857713bc642 nodeName:}" failed. No retries permitted until 2026-04-17 07:59:22.388672856 +0000 UTC m=+420.875794636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates") pod "keda-metrics-apiserver-7c9f485588-8gn7q" (UID: "2302cefd-c787-4b1e-868e-1857713bc642") : references non-existent secret key: tls.crt Apr 17 07:59:21.411762 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:21.411724 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7k8hp" event={"ID":"bda30002-3181-4cfe-a902-43f1649d5cd7","Type":"ContainerStarted","Data":"4fc5c437536f3327718042f3bbb9a99b047241bf9020e22d49a696a3c2117bff"} Apr 17 07:59:22.398065 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:22.398024 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:22.400493 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:22.400471 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2302cefd-c787-4b1e-868e-1857713bc642-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8gn7q\" (UID: \"2302cefd-c787-4b1e-868e-1857713bc642\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:22.479041 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:22.478972 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:22.879383 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:22.879352 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q"] Apr 17 07:59:22.889061 ip-10-0-141-224 kubenswrapper[2560]: W0417 07:59:22.889028 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2302cefd_c787_4b1e_868e_1857713bc642.slice/crio-c4135aaf3c8a6406598a7eb277705ec3138cfb019466eb3d4fc78e493aad3d75 WatchSource:0}: Error finding container c4135aaf3c8a6406598a7eb277705ec3138cfb019466eb3d4fc78e493aad3d75: Status 404 returned error can't find the container with id c4135aaf3c8a6406598a7eb277705ec3138cfb019466eb3d4fc78e493aad3d75 Apr 17 07:59:23.419784 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:23.419740 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7k8hp" event={"ID":"bda30002-3181-4cfe-a902-43f1649d5cd7","Type":"ContainerStarted","Data":"9579187a4b74a13015fca95c6e2283801c68f8f624ea29ce65f29fa64bf9ee2f"} Apr 17 07:59:23.419969 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:23.419857 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 07:59:23.420787 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:23.420760 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" event={"ID":"2302cefd-c787-4b1e-868e-1857713bc642","Type":"ContainerStarted","Data":"c4135aaf3c8a6406598a7eb277705ec3138cfb019466eb3d4fc78e493aad3d75"} Apr 17 07:59:23.437835 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:23.437767 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-7k8hp" podStartSLOduration=1.944747237 
podStartE2EDuration="3.437754826s" podCreationTimestamp="2026-04-17 07:59:20 +0000 UTC" firstStartedPulling="2026-04-17 07:59:21.335445822 +0000 UTC m=+419.822567606" lastFinishedPulling="2026-04-17 07:59:22.828453403 +0000 UTC m=+421.315575195" observedRunningTime="2026-04-17 07:59:23.436168967 +0000 UTC m=+421.923290770" watchObservedRunningTime="2026-04-17 07:59:23.437754826 +0000 UTC m=+421.924876659" Apr 17 07:59:26.433056 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:26.433016 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" event={"ID":"2302cefd-c787-4b1e-868e-1857713bc642","Type":"ContainerStarted","Data":"d36914ec480018fce3710ad056c6aad0173a485b863828d5d5cfb69b6cf068ad"} Apr 17 07:59:26.433455 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:26.433106 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:26.449614 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:26.449564 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" podStartSLOduration=3.507245831 podStartE2EDuration="6.449549655s" podCreationTimestamp="2026-04-17 07:59:20 +0000 UTC" firstStartedPulling="2026-04-17 07:59:22.890516737 +0000 UTC m=+421.377638517" lastFinishedPulling="2026-04-17 07:59:25.832820549 +0000 UTC m=+424.319942341" observedRunningTime="2026-04-17 07:59:26.448299875 +0000 UTC m=+424.935421678" watchObservedRunningTime="2026-04-17 07:59:26.449549655 +0000 UTC m=+424.936671457" Apr 17 07:59:37.440472 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:37.440441 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8gn7q" Apr 17 07:59:41.414813 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:41.414737 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-n8hzz" Apr 17 07:59:44.426721 ip-10-0-141-224 kubenswrapper[2560]: I0417 07:59:44.426688 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-7k8hp" Apr 17 08:00:28.077026 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.076974 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l"] Apr 17 08:00:28.080274 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.080255 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.084219 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.084189 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 08:00:28.084340 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.084277 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-wsvfb\"" Apr 17 08:00:28.085281 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.085260 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:00:28.085400 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.085313 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:00:28.097655 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.097628 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l"] Apr 17 08:00:28.141500 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.141468 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4446917-6447-409d-9c3d-d26288ad810a-cert\") pod 
\"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.141660 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.141520 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvr2z\" (UniqueName: \"kubernetes.io/projected/b4446917-6447-409d-9c3d-d26288ad810a-kube-api-access-hvr2z\") pod \"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.243119 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.243074 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4446917-6447-409d-9c3d-d26288ad810a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.243326 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:00:28.243199 2560 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 17 08:00:28.243326 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:00:28.243289 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4446917-6447-409d-9c3d-d26288ad810a-cert podName:b4446917-6447-409d-9c3d-d26288ad810a nodeName:}" failed. No retries permitted until 2026-04-17 08:00:28.74326627 +0000 UTC m=+487.230388066 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4446917-6447-409d-9c3d-d26288ad810a-cert") pod "llmisvc-controller-manager-68cc5db7c4-2zm6l" (UID: "b4446917-6447-409d-9c3d-d26288ad810a") : secret "llmisvc-webhook-server-cert" not found Apr 17 08:00:28.243466 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.243386 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvr2z\" (UniqueName: \"kubernetes.io/projected/b4446917-6447-409d-9c3d-d26288ad810a-kube-api-access-hvr2z\") pod \"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.253673 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.253648 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvr2z\" (UniqueName: \"kubernetes.io/projected/b4446917-6447-409d-9c3d-d26288ad810a-kube-api-access-hvr2z\") pod \"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.747853 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.747816 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4446917-6447-409d-9c3d-d26288ad810a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:28.750285 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.750258 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4446917-6447-409d-9c3d-d26288ad810a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zm6l\" (UID: \"b4446917-6447-409d-9c3d-d26288ad810a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 
08:00:28.990070 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:28.990034 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:29.105978 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:29.105882 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l"] Apr 17 08:00:29.108858 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:00:29.108824 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb4446917_6447_409d_9c3d_d26288ad810a.slice/crio-79c32cdb405944179b4173de67be42f800cf9e692a0b32dc198d6a44ea27c8c8 WatchSource:0}: Error finding container 79c32cdb405944179b4173de67be42f800cf9e692a0b32dc198d6a44ea27c8c8: Status 404 returned error can't find the container with id 79c32cdb405944179b4173de67be42f800cf9e692a0b32dc198d6a44ea27c8c8 Apr 17 08:00:29.634221 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:29.634184 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" event={"ID":"b4446917-6447-409d-9c3d-d26288ad810a","Type":"ContainerStarted","Data":"79c32cdb405944179b4173de67be42f800cf9e692a0b32dc198d6a44ea27c8c8"} Apr 17 08:00:37.666257 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:37.666213 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" event={"ID":"b4446917-6447-409d-9c3d-d26288ad810a","Type":"ContainerStarted","Data":"d286d7dee5ca333233088620184436e59cd1417ca670f6ce09999ec43d16b62e"} Apr 17 08:00:37.666651 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:37.666285 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:00:37.683527 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:00:37.683475 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" podStartSLOduration=1.60125546 podStartE2EDuration="9.683462726s" podCreationTimestamp="2026-04-17 08:00:28 +0000 UTC" firstStartedPulling="2026-04-17 08:00:29.110071881 +0000 UTC m=+487.597193663" lastFinishedPulling="2026-04-17 08:00:37.192279148 +0000 UTC m=+495.679400929" observedRunningTime="2026-04-17 08:00:37.681067991 +0000 UTC m=+496.168189795" watchObservedRunningTime="2026-04-17 08:00:37.683462726 +0000 UTC m=+496.170584528" Apr 17 08:01:08.672079 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:08.672048 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zm6l" Apr 17 08:01:43.262516 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.262422 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-9nkw8"] Apr 17 08:01:43.265707 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.265683 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.268217 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.268198 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-wr9rr\"" Apr 17 08:01:43.268317 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.268235 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 08:01:43.276573 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.276549 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9nkw8"] Apr 17 08:01:43.363339 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.363297 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52aaae27-68eb-4dd5-a27b-19d94d278505-tls-certs\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: \"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.363518 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.363355 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6mw\" (UniqueName: \"kubernetes.io/projected/52aaae27-68eb-4dd5-a27b-19d94d278505-kube-api-access-hm6mw\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: \"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.463959 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.463924 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52aaae27-68eb-4dd5-a27b-19d94d278505-tls-certs\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: \"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.464170 ip-10-0-141-224 
kubenswrapper[2560]: I0417 08:01:43.463969 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6mw\" (UniqueName: \"kubernetes.io/projected/52aaae27-68eb-4dd5-a27b-19d94d278505-kube-api-access-hm6mw\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: \"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.464170 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:01:43.464093 2560 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 08:01:43.464170 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:01:43.464170 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52aaae27-68eb-4dd5-a27b-19d94d278505-tls-certs podName:52aaae27-68eb-4dd5-a27b-19d94d278505 nodeName:}" failed. No retries permitted until 2026-04-17 08:01:43.964148935 +0000 UTC m=+562.451270720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/52aaae27-68eb-4dd5-a27b-19d94d278505-tls-certs") pod "model-serving-api-86f7b4b499-9nkw8" (UID: "52aaae27-68eb-4dd5-a27b-19d94d278505") : secret "model-serving-api-tls" not found Apr 17 08:01:43.472699 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.472670 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6mw\" (UniqueName: \"kubernetes.io/projected/52aaae27-68eb-4dd5-a27b-19d94d278505-kube-api-access-hm6mw\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: \"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.969046 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.968977 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52aaae27-68eb-4dd5-a27b-19d94d278505-tls-certs\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: 
\"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:43.971440 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:43.971415 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52aaae27-68eb-4dd5-a27b-19d94d278505-tls-certs\") pod \"model-serving-api-86f7b4b499-9nkw8\" (UID: \"52aaae27-68eb-4dd5-a27b-19d94d278505\") " pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:44.176584 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:44.176542 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:44.295387 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:44.295359 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9nkw8"] Apr 17 08:01:44.297808 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:01:44.297778 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52aaae27_68eb_4dd5_a27b_19d94d278505.slice/crio-fce44d648ce33a7768b246704b6285404a8809022f00826aafb0bd0034eef04f WatchSource:0}: Error finding container fce44d648ce33a7768b246704b6285404a8809022f00826aafb0bd0034eef04f: Status 404 returned error can't find the container with id fce44d648ce33a7768b246704b6285404a8809022f00826aafb0bd0034eef04f Apr 17 08:01:44.879229 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:44.879176 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9nkw8" event={"ID":"52aaae27-68eb-4dd5-a27b-19d94d278505","Type":"ContainerStarted","Data":"fce44d648ce33a7768b246704b6285404a8809022f00826aafb0bd0034eef04f"} Apr 17 08:01:46.886949 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:46.886912 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9nkw8" 
event={"ID":"52aaae27-68eb-4dd5-a27b-19d94d278505","Type":"ContainerStarted","Data":"5c979d4f8ef5195b277be3fe8f406aa8a0374c6cd4a36a308e0cc284eef954bb"} Apr 17 08:01:46.887322 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:46.887062 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:01:46.906840 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:46.906744 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-9nkw8" podStartSLOduration=1.56090021 podStartE2EDuration="3.906729795s" podCreationTimestamp="2026-04-17 08:01:43 +0000 UTC" firstStartedPulling="2026-04-17 08:01:44.299394605 +0000 UTC m=+562.786516389" lastFinishedPulling="2026-04-17 08:01:46.645224193 +0000 UTC m=+565.132345974" observedRunningTime="2026-04-17 08:01:46.905395809 +0000 UTC m=+565.392517613" watchObservedRunningTime="2026-04-17 08:01:46.906729795 +0000 UTC m=+565.393851597" Apr 17 08:01:57.895027 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:01:57.894981 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-9nkw8" Apr 17 08:03:38.915388 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.915354 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj"] Apr 17 08:03:38.918463 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.918447 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:38.920716 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.920691 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-08159-kube-rbac-proxy-sar-config\"" Apr 17 08:03:38.920716 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.920710 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:03:38.920896 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.920709 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\"" Apr 17 08:03:38.920896 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.920721 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-08159-serving-cert\"" Apr 17 08:03:38.925408 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.925383 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj"] Apr 17 08:03:38.999040 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.999006 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:38.999193 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:38.999112 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcee06dc-f1a7-46af-a694-5b4320c29862-openshift-service-ca-bundle\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: 
\"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.100253 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.100219 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.100435 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.100276 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcee06dc-f1a7-46af-a694-5b4320c29862-openshift-service-ca-bundle\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.100435 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:03:39.100381 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-08159-serving-cert: secret "switch-graph-08159-serving-cert" not found Apr 17 08:03:39.100547 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:03:39.100490 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls podName:fcee06dc-f1a7-46af-a694-5b4320c29862 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:39.600467132 +0000 UTC m=+678.087588948 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls") pod "switch-graph-08159-8656fb4779-lqhtj" (UID: "fcee06dc-f1a7-46af-a694-5b4320c29862") : secret "switch-graph-08159-serving-cert" not found Apr 17 08:03:39.100807 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.100789 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcee06dc-f1a7-46af-a694-5b4320c29862-openshift-service-ca-bundle\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.604119 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.604071 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.606414 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.606394 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls\") pod \"switch-graph-08159-8656fb4779-lqhtj\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.828579 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.828526 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:39.944257 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.944233 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj"] Apr 17 08:03:39.946864 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:03:39.946831 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcee06dc_f1a7_46af_a694_5b4320c29862.slice/crio-ccf542641068653d3832af6ae562cd3bcdb3f8458f1121395316b866b2917059 WatchSource:0}: Error finding container ccf542641068653d3832af6ae562cd3bcdb3f8458f1121395316b866b2917059: Status 404 returned error can't find the container with id ccf542641068653d3832af6ae562cd3bcdb3f8458f1121395316b866b2917059 Apr 17 08:03:39.948567 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:39.948548 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:03:40.251248 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:40.251170 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" event={"ID":"fcee06dc-f1a7-46af-a694-5b4320c29862","Type":"ContainerStarted","Data":"ccf542641068653d3832af6ae562cd3bcdb3f8458f1121395316b866b2917059"} Apr 17 08:03:43.261769 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:43.261726 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" event={"ID":"fcee06dc-f1a7-46af-a694-5b4320c29862","Type":"ContainerStarted","Data":"4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f"} Apr 17 08:03:43.262239 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:43.261819 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:43.279293 ip-10-0-141-224 
kubenswrapper[2560]: I0417 08:03:43.279237 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podStartSLOduration=2.9855518439999997 podStartE2EDuration="5.279219393s" podCreationTimestamp="2026-04-17 08:03:38 +0000 UTC" firstStartedPulling="2026-04-17 08:03:39.948679727 +0000 UTC m=+678.435801508" lastFinishedPulling="2026-04-17 08:03:42.242347268 +0000 UTC m=+680.729469057" observedRunningTime="2026-04-17 08:03:43.276585977 +0000 UTC m=+681.763707782" watchObservedRunningTime="2026-04-17 08:03:43.279219393 +0000 UTC m=+681.766341198" Apr 17 08:03:49.270638 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:49.270603 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:03:53.193110 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:53.193078 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj"] Apr 17 08:03:53.193459 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:53.193333 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" containerID="cri-o://4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f" gracePeriod=30 Apr 17 08:03:54.268643 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:54.268605 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:03:59.270144 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:03:59.270102 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" 
podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:04.269641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:04.269601 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:04.270118 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:04.269707 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:04:09.269264 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:09.269219 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:14.268853 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:14.268772 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:18.857651 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.857614 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc"] Apr 17 08:04:18.864100 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.864078 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:18.866427 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.866406 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 17 08:04:18.866554 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.866406 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 17 08:04:18.867175 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.867152 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc"] Apr 17 08:04:18.912980 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.912953 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5b1293-930c-4565-b12f-3ee12905cd3e-openshift-service-ca-bundle\") pod \"model-chainer-7b7cd8db88-p4crc\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:18.913093 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:18.913066 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5b1293-930c-4565-b12f-3ee12905cd3e-proxy-tls\") pod \"model-chainer-7b7cd8db88-p4crc\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:19.014286 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.014247 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5b1293-930c-4565-b12f-3ee12905cd3e-openshift-service-ca-bundle\") pod \"model-chainer-7b7cd8db88-p4crc\" (UID: 
\"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:19.014475 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.014323 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5b1293-930c-4565-b12f-3ee12905cd3e-proxy-tls\") pod \"model-chainer-7b7cd8db88-p4crc\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:19.014872 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.014848 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5b1293-930c-4565-b12f-3ee12905cd3e-openshift-service-ca-bundle\") pod \"model-chainer-7b7cd8db88-p4crc\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:19.016812 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.016784 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5b1293-930c-4565-b12f-3ee12905cd3e-proxy-tls\") pod \"model-chainer-7b7cd8db88-p4crc\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:19.175752 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.175669 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:19.269797 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.269752 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:19.293202 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.293173 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc"] Apr 17 08:04:19.296234 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:04:19.296207 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5b1293_930c_4565_b12f_3ee12905cd3e.slice/crio-6764c65fcdf35fb051db91a393b873a4b1a8607f27cc2f5a550686b448936868 WatchSource:0}: Error finding container 6764c65fcdf35fb051db91a393b873a4b1a8607f27cc2f5a550686b448936868: Status 404 returned error can't find the container with id 6764c65fcdf35fb051db91a393b873a4b1a8607f27cc2f5a550686b448936868 Apr 17 08:04:19.380218 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:19.380183 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" event={"ID":"3c5b1293-930c-4565-b12f-3ee12905cd3e","Type":"ContainerStarted","Data":"6764c65fcdf35fb051db91a393b873a4b1a8607f27cc2f5a550686b448936868"} Apr 17 08:04:20.384047 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:20.384006 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" event={"ID":"3c5b1293-930c-4565-b12f-3ee12905cd3e","Type":"ContainerStarted","Data":"077c3c705f4b5665968073eb2056ab5e9d0c641ef7c43910428804e39fbb414d"} Apr 17 08:04:20.384398 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:20.384120 2560 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:20.400355 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:20.400315 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podStartSLOduration=2.400301046 podStartE2EDuration="2.400301046s" podCreationTimestamp="2026-04-17 08:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:04:20.39785818 +0000 UTC m=+718.884979982" watchObservedRunningTime="2026-04-17 08:04:20.400301046 +0000 UTC m=+718.887422848" Apr 17 08:04:23.328039 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.328018 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:04:23.395251 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.395207 2560 generic.go:358] "Generic (PLEG): container finished" podID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerID="4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f" exitCode=0 Apr 17 08:04:23.395407 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.395272 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" Apr 17 08:04:23.395407 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.395276 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" event={"ID":"fcee06dc-f1a7-46af-a694-5b4320c29862","Type":"ContainerDied","Data":"4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f"} Apr 17 08:04:23.395407 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.395315 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj" event={"ID":"fcee06dc-f1a7-46af-a694-5b4320c29862","Type":"ContainerDied","Data":"ccf542641068653d3832af6ae562cd3bcdb3f8458f1121395316b866b2917059"} Apr 17 08:04:23.395407 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.395331 2560 scope.go:117] "RemoveContainer" containerID="4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f" Apr 17 08:04:23.403173 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.403158 2560 scope.go:117] "RemoveContainer" containerID="4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f" Apr 17 08:04:23.403420 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:04:23.403404 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f\": container with ID starting with 4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f not found: ID does not exist" containerID="4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f" Apr 17 08:04:23.403482 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.403427 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f"} err="failed to get container status 
\"4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f\": rpc error: code = NotFound desc = could not find container \"4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f\": container with ID starting with 4fc184c1e02972e32371b98a4f7adb182c0851a0110d1393429e248039e24d9f not found: ID does not exist" Apr 17 08:04:23.453767 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.453702 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls\") pod \"fcee06dc-f1a7-46af-a694-5b4320c29862\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " Apr 17 08:04:23.453866 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.453781 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcee06dc-f1a7-46af-a694-5b4320c29862-openshift-service-ca-bundle\") pod \"fcee06dc-f1a7-46af-a694-5b4320c29862\" (UID: \"fcee06dc-f1a7-46af-a694-5b4320c29862\") " Apr 17 08:04:23.454171 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.454146 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcee06dc-f1a7-46af-a694-5b4320c29862-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fcee06dc-f1a7-46af-a694-5b4320c29862" (UID: "fcee06dc-f1a7-46af-a694-5b4320c29862"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:04:23.455816 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.455796 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fcee06dc-f1a7-46af-a694-5b4320c29862" (UID: "fcee06dc-f1a7-46af-a694-5b4320c29862"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:04:23.555251 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.555211 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcee06dc-f1a7-46af-a694-5b4320c29862-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:04:23.555251 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.555247 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcee06dc-f1a7-46af-a694-5b4320c29862-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:04:23.716033 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.715948 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj"] Apr 17 08:04:23.719006 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:23.718964 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-08159-8656fb4779-lqhtj"] Apr 17 08:04:24.134933 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:24.134901 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" path="/var/lib/kubelet/pods/fcee06dc-f1a7-46af-a694-5b4320c29862/volumes" Apr 17 08:04:26.393817 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:26.393785 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:28.989042 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:28.989007 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc"] Apr 17 08:04:28.989400 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:28.989237 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" 
podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" containerID="cri-o://077c3c705f4b5665968073eb2056ab5e9d0c641ef7c43910428804e39fbb414d" gracePeriod=30 Apr 17 08:04:31.391891 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:31.391851 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:36.391698 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:36.391658 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:41.391933 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:41.391894 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:41.392384 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:41.392032 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:46.392052 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:46.391980 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:51.391750 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:51.391710 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" 
containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:56.391929 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:56.391889 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:04:59.512784 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.512755 2560 generic.go:358] "Generic (PLEG): container finished" podID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerID="077c3c705f4b5665968073eb2056ab5e9d0c641ef7c43910428804e39fbb414d" exitCode=0 Apr 17 08:04:59.513118 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.512815 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" event={"ID":"3c5b1293-930c-4565-b12f-3ee12905cd3e","Type":"ContainerDied","Data":"077c3c705f4b5665968073eb2056ab5e9d0c641ef7c43910428804e39fbb414d"} Apr 17 08:04:59.628671 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.628648 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:04:59.754951 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.754873 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5b1293-930c-4565-b12f-3ee12905cd3e-openshift-service-ca-bundle\") pod \"3c5b1293-930c-4565-b12f-3ee12905cd3e\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " Apr 17 08:04:59.755094 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.754974 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5b1293-930c-4565-b12f-3ee12905cd3e-proxy-tls\") pod \"3c5b1293-930c-4565-b12f-3ee12905cd3e\" (UID: \"3c5b1293-930c-4565-b12f-3ee12905cd3e\") " Apr 17 08:04:59.755251 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.755224 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5b1293-930c-4565-b12f-3ee12905cd3e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3c5b1293-930c-4565-b12f-3ee12905cd3e" (UID: "3c5b1293-930c-4565-b12f-3ee12905cd3e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:04:59.756897 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.756877 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5b1293-930c-4565-b12f-3ee12905cd3e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3c5b1293-930c-4565-b12f-3ee12905cd3e" (UID: "3c5b1293-930c-4565-b12f-3ee12905cd3e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:04:59.856311 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.856276 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5b1293-930c-4565-b12f-3ee12905cd3e-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:04:59.856311 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:04:59.856305 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5b1293-930c-4565-b12f-3ee12905cd3e-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:05:00.517062 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:00.517024 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" event={"ID":"3c5b1293-930c-4565-b12f-3ee12905cd3e","Type":"ContainerDied","Data":"6764c65fcdf35fb051db91a393b873a4b1a8607f27cc2f5a550686b448936868"} Apr 17 08:05:00.517540 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:00.517072 2560 scope.go:117] "RemoveContainer" containerID="077c3c705f4b5665968073eb2056ab5e9d0c641ef7c43910428804e39fbb414d" Apr 17 08:05:00.517540 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:00.517076 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc" Apr 17 08:05:00.532013 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:00.531974 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc"] Apr 17 08:05:00.533949 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:00.533924 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7b7cd8db88-p4crc"] Apr 17 08:05:02.133012 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:02.132956 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" path="/var/lib/kubelet/pods/3c5b1293-930c-4565-b12f-3ee12905cd3e/volumes" Apr 17 08:05:03.433361 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433318 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw"] Apr 17 08:05:03.433827 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433803 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" Apr 17 08:05:03.433911 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433830 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" Apr 17 08:05:03.433911 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433845 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" Apr 17 08:05:03.433911 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433854 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" Apr 17 08:05:03.434108 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433957 2560 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fcee06dc-f1a7-46af-a694-5b4320c29862" containerName="switch-graph-08159" Apr 17 08:05:03.434108 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.433980 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c5b1293-930c-4565-b12f-3ee12905cd3e" containerName="model-chainer" Apr 17 08:05:03.438349 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.438326 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:03.440691 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.440669 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-4828d-serving-cert\"" Apr 17 08:05:03.440818 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.440784 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\"" Apr 17 08:05:03.441552 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.441532 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-4828d-kube-rbac-proxy-sar-config\"" Apr 17 08:05:03.441638 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.441547 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:05:03.446321 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.446298 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw"] Apr 17 08:05:03.485714 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.485658 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdaad542-c3c4-449a-b6bb-7e2448ba506a-openshift-service-ca-bundle\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: 
\"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:03.485714 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.485713 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:03.586830 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.586791 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdaad542-c3c4-449a-b6bb-7e2448ba506a-openshift-service-ca-bundle\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:03.586830 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.586833 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:03.587126 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:05:03.586926 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-4828d-serving-cert: secret "switch-graph-4828d-serving-cert" not found Apr 17 08:05:03.587126 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:05:03.587020 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls podName:bdaad542-c3c4-449a-b6bb-7e2448ba506a nodeName:}" failed. 
No retries permitted until 2026-04-17 08:05:04.086975024 +0000 UTC m=+762.574096804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls") pod "switch-graph-4828d-85cdb4b486-9xjzw" (UID: "bdaad542-c3c4-449a-b6bb-7e2448ba506a") : secret "switch-graph-4828d-serving-cert" not found Apr 17 08:05:03.587537 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:03.587512 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdaad542-c3c4-449a-b6bb-7e2448ba506a-openshift-service-ca-bundle\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:04.090078 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.090042 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:04.092376 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.092346 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls\") pod \"switch-graph-4828d-85cdb4b486-9xjzw\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:04.348591 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.348493 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:04.465245 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.465189 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw"] Apr 17 08:05:04.468002 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:05:04.467960 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdaad542_c3c4_449a_b6bb_7e2448ba506a.slice/crio-a240377aa4c6e3648ec5b01e5143ea83f8e913b38668e22b60ccc83f8a72020e WatchSource:0}: Error finding container a240377aa4c6e3648ec5b01e5143ea83f8e913b38668e22b60ccc83f8a72020e: Status 404 returned error can't find the container with id a240377aa4c6e3648ec5b01e5143ea83f8e913b38668e22b60ccc83f8a72020e Apr 17 08:05:04.530376 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.530347 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" event={"ID":"bdaad542-c3c4-449a-b6bb-7e2448ba506a","Type":"ContainerStarted","Data":"8817d88a13ba790932943b40a2232f6a1731d33a4eed25d3a4648a7d9ebb3ea4"} Apr 17 08:05:04.530492 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.530386 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" event={"ID":"bdaad542-c3c4-449a-b6bb-7e2448ba506a","Type":"ContainerStarted","Data":"a240377aa4c6e3648ec5b01e5143ea83f8e913b38668e22b60ccc83f8a72020e"} Apr 17 08:05:04.530492 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.530420 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:04.545520 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:04.545463 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" 
podStartSLOduration=1.5454427480000001 podStartE2EDuration="1.545442748s" podCreationTimestamp="2026-04-17 08:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:05:04.543458452 +0000 UTC m=+763.030580256" watchObservedRunningTime="2026-04-17 08:05:04.545442748 +0000 UTC m=+763.032564556" Apr 17 08:05:10.539232 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:10.539203 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:05:39.176099 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.176053 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"] Apr 17 08:05:39.180618 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.180598 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.182699 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.182681 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c3bdf-kube-rbac-proxy-sar-config\"" Apr 17 08:05:39.182780 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.182698 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c3bdf-serving-cert\"" Apr 17 08:05:39.187229 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.187204 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"] Apr 17 08:05:39.253804 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.253776 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls\") pod 
\"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.253975 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.253829 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31c24e7-f6c3-43bc-98b0-1e614dff8438-openshift-service-ca-bundle\") pod \"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.355135 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.355098 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls\") pod \"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.355292 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.355154 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31c24e7-f6c3-43bc-98b0-1e614dff8438-openshift-service-ca-bundle\") pod \"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.355292 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:05:39.355240 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-c3bdf-serving-cert: secret "sequence-graph-c3bdf-serving-cert" not found Apr 17 08:05:39.355379 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:05:39.355311 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls 
podName:d31c24e7-f6c3-43bc-98b0-1e614dff8438 nodeName:}" failed. No retries permitted until 2026-04-17 08:05:39.855294728 +0000 UTC m=+798.342416509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls") pod "sequence-graph-c3bdf-57f79946fd-dp8vh" (UID: "d31c24e7-f6c3-43bc-98b0-1e614dff8438") : secret "sequence-graph-c3bdf-serving-cert" not found Apr 17 08:05:39.355757 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.355740 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31c24e7-f6c3-43bc-98b0-1e614dff8438-openshift-service-ca-bundle\") pod \"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.859865 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.859831 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls\") pod \"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:39.862119 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:39.862092 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls\") pod \"sequence-graph-c3bdf-57f79946fd-dp8vh\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") " pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:40.091310 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:40.091265 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:40.207570 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:40.207546 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"] Apr 17 08:05:40.210105 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:05:40.210078 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd31c24e7_f6c3_43bc_98b0_1e614dff8438.slice/crio-461411e3ad129c03ab1aa6702e4ca4eda2bc60259e44e4c65a47824e55735b7f WatchSource:0}: Error finding container 461411e3ad129c03ab1aa6702e4ca4eda2bc60259e44e4c65a47824e55735b7f: Status 404 returned error can't find the container with id 461411e3ad129c03ab1aa6702e4ca4eda2bc60259e44e4c65a47824e55735b7f Apr 17 08:05:40.648037 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:40.647983 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" event={"ID":"d31c24e7-f6c3-43bc-98b0-1e614dff8438","Type":"ContainerStarted","Data":"c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf"} Apr 17 08:05:40.648037 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:40.648037 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" event={"ID":"d31c24e7-f6c3-43bc-98b0-1e614dff8438","Type":"ContainerStarted","Data":"461411e3ad129c03ab1aa6702e4ca4eda2bc60259e44e4c65a47824e55735b7f"} Apr 17 08:05:40.648241 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:40.648120 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:05:40.662964 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:40.662906 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" 
podStartSLOduration=1.662889952 podStartE2EDuration="1.662889952s" podCreationTimestamp="2026-04-17 08:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:05:40.662139164 +0000 UTC m=+799.149260968" watchObservedRunningTime="2026-04-17 08:05:40.662889952 +0000 UTC m=+799.150011759" Apr 17 08:05:46.656667 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:05:46.656635 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:13:18.144430 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:18.144348 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw"] Apr 17 08:13:18.144968 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:18.144644 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" containerID="cri-o://8817d88a13ba790932943b40a2232f6a1731d33a4eed25d3a4648a7d9ebb3ea4" gracePeriod=30 Apr 17 08:13:20.538652 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:20.538611 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:13:25.538426 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:25.538384 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:13:30.538746 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:30.538706 2560 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:13:30.539197 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:30.538834 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:13:35.538114 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:35.538070 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:13:40.538335 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:40.538296 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:13:45.538024 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:45.537964 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:13:48.216505 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.216475 2560 generic.go:358] "Generic (PLEG): container finished" podID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerID="8817d88a13ba790932943b40a2232f6a1731d33a4eed25d3a4648a7d9ebb3ea4" exitCode=0 Apr 17 08:13:48.216907 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.216513 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" event={"ID":"bdaad542-c3c4-449a-b6bb-7e2448ba506a","Type":"ContainerDied","Data":"8817d88a13ba790932943b40a2232f6a1731d33a4eed25d3a4648a7d9ebb3ea4"} Apr 17 08:13:48.280686 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.280663 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:13:48.429839 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.429750 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdaad542-c3c4-449a-b6bb-7e2448ba506a-openshift-service-ca-bundle\") pod \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " Apr 17 08:13:48.429839 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.429832 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls\") pod \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\" (UID: \"bdaad542-c3c4-449a-b6bb-7e2448ba506a\") " Apr 17 08:13:48.430181 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.430158 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdaad542-c3c4-449a-b6bb-7e2448ba506a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bdaad542-c3c4-449a-b6bb-7e2448ba506a" (UID: "bdaad542-c3c4-449a-b6bb-7e2448ba506a"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:13:48.431885 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.431852 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bdaad542-c3c4-449a-b6bb-7e2448ba506a" (UID: "bdaad542-c3c4-449a-b6bb-7e2448ba506a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:13:48.530762 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.530727 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdaad542-c3c4-449a-b6bb-7e2448ba506a-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:13:48.530762 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:48.530756 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdaad542-c3c4-449a-b6bb-7e2448ba506a-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:13:49.221180 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:49.221152 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" Apr 17 08:13:49.221641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:49.221146 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw" event={"ID":"bdaad542-c3c4-449a-b6bb-7e2448ba506a","Type":"ContainerDied","Data":"a240377aa4c6e3648ec5b01e5143ea83f8e913b38668e22b60ccc83f8a72020e"} Apr 17 08:13:49.221641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:49.221289 2560 scope.go:117] "RemoveContainer" containerID="8817d88a13ba790932943b40a2232f6a1731d33a4eed25d3a4648a7d9ebb3ea4" Apr 17 08:13:49.240610 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:49.240579 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw"] Apr 17 08:13:49.244423 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:49.244401 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4828d-85cdb4b486-9xjzw"] Apr 17 08:13:50.132938 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:50.132902 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" path="/var/lib/kubelet/pods/bdaad542-c3c4-449a-b6bb-7e2448ba506a/volumes" Apr 17 08:13:53.962515 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:53.962477 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"] Apr 17 08:13:53.962886 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:53.962735 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" containerID="cri-o://c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf" gracePeriod=30 Apr 17 08:13:56.654890 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:13:56.654844 2560 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:14:01.654731 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:01.654682 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:14:06.654620 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:06.654573 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:14:06.655016 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:06.654688 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" Apr 17 08:14:11.654692 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:11.654645 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:14:16.654307 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:16.654269 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:14:21.654663 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:21.654623 2560 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:14:24.105925 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.105897 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"
Apr 17 08:14:24.209325 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.209294 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31c24e7-f6c3-43bc-98b0-1e614dff8438-openshift-service-ca-bundle\") pod \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") "
Apr 17 08:14:24.209502 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.209334 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls\") pod \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\" (UID: \"d31c24e7-f6c3-43bc-98b0-1e614dff8438\") "
Apr 17 08:14:24.209672 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.209647 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31c24e7-f6c3-43bc-98b0-1e614dff8438-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d31c24e7-f6c3-43bc-98b0-1e614dff8438" (UID: "d31c24e7-f6c3-43bc-98b0-1e614dff8438"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:14:24.211485 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.211456 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d31c24e7-f6c3-43bc-98b0-1e614dff8438" (UID: "d31c24e7-f6c3-43bc-98b0-1e614dff8438"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:14:24.310122 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.310033 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31c24e7-f6c3-43bc-98b0-1e614dff8438-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 08:14:24.310122 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.310066 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d31c24e7-f6c3-43bc-98b0-1e614dff8438-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 08:14:24.336055 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.336022 2560 generic.go:358] "Generic (PLEG): container finished" podID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerID="c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf" exitCode=0
Apr 17 08:14:24.336201 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.336073 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" event={"ID":"d31c24e7-f6c3-43bc-98b0-1e614dff8438","Type":"ContainerDied","Data":"c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf"}
Apr 17 08:14:24.336201 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.336123 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh" event={"ID":"d31c24e7-f6c3-43bc-98b0-1e614dff8438","Type":"ContainerDied","Data":"461411e3ad129c03ab1aa6702e4ca4eda2bc60259e44e4c65a47824e55735b7f"}
Apr 17 08:14:24.336201 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.336143 2560 scope.go:117] "RemoveContainer" containerID="c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf"
Apr 17 08:14:24.336201 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.336088 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"
Apr 17 08:14:24.343814 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.343793 2560 scope.go:117] "RemoveContainer" containerID="c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf"
Apr 17 08:14:24.344080 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:14:24.344057 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf\": container with ID starting with c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf not found: ID does not exist" containerID="c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf"
Apr 17 08:14:24.344165 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.344091 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf"} err="failed to get container status \"c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf\": rpc error: code = NotFound desc = could not find container \"c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf\": container with ID starting with c3957aa3a4b6b12cc0481bb171a4aaab8478526aa5f7e757ca5a04cc9c52faaf not found: ID does not exist"
Apr 17 08:14:24.355400 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.355375 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"]
Apr 17 08:14:24.359231 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:24.359210 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c3bdf-57f79946fd-dp8vh"]
Apr 17 08:14:26.133376 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:26.133343 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" path="/var/lib/kubelet/pods/d31c24e7-f6c3-43bc-98b0-1e614dff8438/volumes"
Apr 17 08:14:28.391524 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391489 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"]
Apr 17 08:14:28.391900 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391806 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d"
Apr 17 08:14:28.391900 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391816 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d"
Apr 17 08:14:28.391900 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391828 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf"
Apr 17 08:14:28.391900 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391833 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf"
Apr 17 08:14:28.391900 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391882 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdaad542-c3c4-449a-b6bb-7e2448ba506a" containerName="switch-graph-4828d"
Apr 17 08:14:28.391900 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.391890 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="d31c24e7-f6c3-43bc-98b0-1e614dff8438" containerName="sequence-graph-c3bdf"
Apr 17 08:14:28.395933 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.395915 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:28.398259 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.398236 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-390bf-kube-rbac-proxy-sar-config\""
Apr 17 08:14:28.398387 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.398236 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-390bf-serving-cert\""
Apr 17 08:14:28.398454 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.398431 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 08:14:28.398454 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.398431 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\""
Apr 17 08:14:28.402350 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.402328 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"]
Apr 17 08:14:28.443698 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.443663 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:28.443873 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.443732 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7acd7c46-18d3-4fa4-9697-1f83492ffd81-openshift-service-ca-bundle\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:28.544530 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.544485 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:28.544724 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.544573 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7acd7c46-18d3-4fa4-9697-1f83492ffd81-openshift-service-ca-bundle\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:28.544724 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:14:28.544629 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-390bf-serving-cert: secret "ensemble-graph-390bf-serving-cert" not found
Apr 17 08:14:28.544724 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:14:28.544690 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls podName:7acd7c46-18d3-4fa4-9697-1f83492ffd81 nodeName:}" failed. No retries permitted until 2026-04-17 08:14:29.044674677 +0000 UTC m=+1327.531796459 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls") pod "ensemble-graph-390bf-c6ccb5b76-jqf2k" (UID: "7acd7c46-18d3-4fa4-9697-1f83492ffd81") : secret "ensemble-graph-390bf-serving-cert" not found
Apr 17 08:14:28.545266 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:28.545242 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7acd7c46-18d3-4fa4-9697-1f83492ffd81-openshift-service-ca-bundle\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:29.049350 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:29.049304 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:29.051785 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:29.051753 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls\") pod \"ensemble-graph-390bf-c6ccb5b76-jqf2k\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") " pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:29.306964 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:29.306876 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:29.424705 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:29.424674 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"]
Apr 17 08:14:29.427980 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:14:29.427942 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7acd7c46_18d3_4fa4_9697_1f83492ffd81.slice/crio-db2026b406fa145c94d2f2337122899435c7c2c01a43ec3a02e4f8c3612ba3cf WatchSource:0}: Error finding container db2026b406fa145c94d2f2337122899435c7c2c01a43ec3a02e4f8c3612ba3cf: Status 404 returned error can't find the container with id db2026b406fa145c94d2f2337122899435c7c2c01a43ec3a02e4f8c3612ba3cf
Apr 17 08:14:29.430237 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:29.430216 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:14:30.357608 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:30.357575 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" event={"ID":"7acd7c46-18d3-4fa4-9697-1f83492ffd81","Type":"ContainerStarted","Data":"413a870c2fd22c2ea8b9b944701ae078864a17909196cca72c667798f0ebf579"}
Apr 17 08:14:30.357608 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:30.357614 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" event={"ID":"7acd7c46-18d3-4fa4-9697-1f83492ffd81","Type":"ContainerStarted","Data":"db2026b406fa145c94d2f2337122899435c7c2c01a43ec3a02e4f8c3612ba3cf"}
Apr 17 08:14:30.357813 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:30.357694 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:30.372532 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:30.372482 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podStartSLOduration=2.37246571 podStartE2EDuration="2.37246571s" podCreationTimestamp="2026-04-17 08:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:14:30.371687967 +0000 UTC m=+1328.858809769" watchObservedRunningTime="2026-04-17 08:14:30.37246571 +0000 UTC m=+1328.859587515"
Apr 17 08:14:36.368753 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:36.368723 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:38.453420 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:38.453388 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"]
Apr 17 08:14:38.453772 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:38.453595 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" containerID="cri-o://413a870c2fd22c2ea8b9b944701ae078864a17909196cca72c667798f0ebf579" gracePeriod=30
Apr 17 08:14:41.365948 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:41.365901 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:14:46.366821 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:46.366772 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:14:51.366259 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:51.366221 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:14:51.366694 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:51.366345 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:14:56.366611 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:14:56.366570 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:01.366219 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:01.366180 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:04.144741 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.144710 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"]
Apr 17 08:15:04.148017 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.147979 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.150088 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.150062 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-375fe-kube-rbac-proxy-sar-config\""
Apr 17 08:15:04.150181 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.150149 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-375fe-serving-cert\""
Apr 17 08:15:04.155939 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.155916 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"]
Apr 17 08:15:04.249025 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.248966 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.249233 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.249123 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d64e4e4-969c-441f-932a-9610f8b815d0-openshift-service-ca-bundle\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.350375 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.350330 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d64e4e4-969c-441f-932a-9610f8b815d0-openshift-service-ca-bundle\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.350573 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.350397 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.350573 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:15:04.350494 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-375fe-serving-cert: secret "sequence-graph-375fe-serving-cert" not found
Apr 17 08:15:04.350573 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:15:04.350547 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls podName:5d64e4e4-969c-441f-932a-9610f8b815d0 nodeName:}" failed. No retries permitted until 2026-04-17 08:15:04.850531122 +0000 UTC m=+1363.337652904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls") pod "sequence-graph-375fe-5b588c76f8-lm94t" (UID: "5d64e4e4-969c-441f-932a-9610f8b815d0") : secret "sequence-graph-375fe-serving-cert" not found
Apr 17 08:15:04.351106 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.351079 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d64e4e4-969c-441f-932a-9610f8b815d0-openshift-service-ca-bundle\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.855405 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.855343 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:04.857773 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:04.857751 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls\") pod \"sequence-graph-375fe-5b588c76f8-lm94t\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") " pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:05.058707 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:05.058672 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:05.176422 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:05.176390 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"]
Apr 17 08:15:05.179509 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:15:05.179483 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d64e4e4_969c_441f_932a_9610f8b815d0.slice/crio-bd16bfd44fbf9cc5473e81fec1c234b6a07ff77abe2deedced51a3751ea5dc46 WatchSource:0}: Error finding container bd16bfd44fbf9cc5473e81fec1c234b6a07ff77abe2deedced51a3751ea5dc46: Status 404 returned error can't find the container with id bd16bfd44fbf9cc5473e81fec1c234b6a07ff77abe2deedced51a3751ea5dc46
Apr 17 08:15:05.478886 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:05.478790 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" event={"ID":"5d64e4e4-969c-441f-932a-9610f8b815d0","Type":"ContainerStarted","Data":"3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae"}
Apr 17 08:15:05.478886 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:05.478825 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" event={"ID":"5d64e4e4-969c-441f-932a-9610f8b815d0","Type":"ContainerStarted","Data":"bd16bfd44fbf9cc5473e81fec1c234b6a07ff77abe2deedced51a3751ea5dc46"}
Apr 17 08:15:05.479128 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:05.478949 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:05.494146 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:05.494099 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podStartSLOduration=1.494085505 podStartE2EDuration="1.494085505s" podCreationTimestamp="2026-04-17 08:15:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:15:05.49348024 +0000 UTC m=+1363.980602057" watchObservedRunningTime="2026-04-17 08:15:05.494085505 +0000 UTC m=+1363.981207313"
Apr 17 08:15:06.366738 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:06.366701 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:08.490838 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.490809 2560 generic.go:358] "Generic (PLEG): container finished" podID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerID="413a870c2fd22c2ea8b9b944701ae078864a17909196cca72c667798f0ebf579" exitCode=0
Apr 17 08:15:08.491179 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.490881 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" event={"ID":"7acd7c46-18d3-4fa4-9697-1f83492ffd81","Type":"ContainerDied","Data":"413a870c2fd22c2ea8b9b944701ae078864a17909196cca72c667798f0ebf579"}
Apr 17 08:15:08.592607 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.592585 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:15:08.790121 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.790024 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7acd7c46-18d3-4fa4-9697-1f83492ffd81-openshift-service-ca-bundle\") pod \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") "
Apr 17 08:15:08.790121 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.790099 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls\") pod \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\" (UID: \"7acd7c46-18d3-4fa4-9697-1f83492ffd81\") "
Apr 17 08:15:08.790462 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.790436 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7acd7c46-18d3-4fa4-9697-1f83492ffd81-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7acd7c46-18d3-4fa4-9697-1f83492ffd81" (UID: "7acd7c46-18d3-4fa4-9697-1f83492ffd81"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:15:08.792121 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.792095 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7acd7c46-18d3-4fa4-9697-1f83492ffd81" (UID: "7acd7c46-18d3-4fa4-9697-1f83492ffd81"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:15:08.890963 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.890925 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7acd7c46-18d3-4fa4-9697-1f83492ffd81-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 08:15:08.890963 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:08.890957 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7acd7c46-18d3-4fa4-9697-1f83492ffd81-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 08:15:09.495617 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:09.495584 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k" event={"ID":"7acd7c46-18d3-4fa4-9697-1f83492ffd81","Type":"ContainerDied","Data":"db2026b406fa145c94d2f2337122899435c7c2c01a43ec3a02e4f8c3612ba3cf"}
Apr 17 08:15:09.495617 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:09.495613 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"
Apr 17 08:15:09.496138 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:09.495630 2560 scope.go:117] "RemoveContainer" containerID="413a870c2fd22c2ea8b9b944701ae078864a17909196cca72c667798f0ebf579"
Apr 17 08:15:09.515507 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:09.515478 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"]
Apr 17 08:15:09.518836 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:09.518814 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-390bf-c6ccb5b76-jqf2k"]
Apr 17 08:15:10.132374 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:10.132333 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" path="/var/lib/kubelet/pods/7acd7c46-18d3-4fa4-9697-1f83492ffd81/volumes"
Apr 17 08:15:11.488521 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:11.488488 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:14.229014 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:14.228966 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"]
Apr 17 08:15:14.229373 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:14.229229 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" containerID="cri-o://3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae" gracePeriod=30
Apr 17 08:15:16.487137 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:16.487098 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:21.486863 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:21.486816 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:26.486794 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:26.486748 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:26.487191 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:26.486860 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:31.486724 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:31.486685 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:36.486924 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:36.486880 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:41.487180 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:41.487142 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:15:44.258404 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:15:44.258370 2560 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d64e4e4_969c_441f_932a_9610f8b815d0.slice/crio-conmon-3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 08:15:44.363631 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.363602 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:44.476940 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.476904 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls\") pod \"5d64e4e4-969c-441f-932a-9610f8b815d0\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") "
Apr 17 08:15:44.477136 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.476954 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d64e4e4-969c-441f-932a-9610f8b815d0-openshift-service-ca-bundle\") pod \"5d64e4e4-969c-441f-932a-9610f8b815d0\" (UID: \"5d64e4e4-969c-441f-932a-9610f8b815d0\") "
Apr 17 08:15:44.477385 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.477356 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d64e4e4-969c-441f-932a-9610f8b815d0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5d64e4e4-969c-441f-932a-9610f8b815d0" (UID: "5d64e4e4-969c-441f-932a-9610f8b815d0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:15:44.478976 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.478949 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d64e4e4-969c-441f-932a-9610f8b815d0" (UID: "5d64e4e4-969c-441f-932a-9610f8b815d0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:15:44.578050 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.578010 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d64e4e4-969c-441f-932a-9610f8b815d0-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 08:15:44.578050 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.578046 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d64e4e4-969c-441f-932a-9610f8b815d0-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\""
Apr 17 08:15:44.606496 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.606454 2560 generic.go:358] "Generic (PLEG): container finished" podID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerID="3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae" exitCode=0
Apr 17 08:15:44.606641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.606568 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"
Apr 17 08:15:44.606641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.606575 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" event={"ID":"5d64e4e4-969c-441f-932a-9610f8b815d0","Type":"ContainerDied","Data":"3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae"}
Apr 17 08:15:44.606641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.606620 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t" event={"ID":"5d64e4e4-969c-441f-932a-9610f8b815d0","Type":"ContainerDied","Data":"bd16bfd44fbf9cc5473e81fec1c234b6a07ff77abe2deedced51a3751ea5dc46"}
Apr 17 08:15:44.606641 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.606635 2560 scope.go:117] "RemoveContainer" containerID="3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae"
Apr 17 08:15:44.614548 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.614530 2560 scope.go:117] "RemoveContainer" containerID="3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae"
Apr 17 08:15:44.614845 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:15:44.614819 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae\": container with ID starting with 3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae not found: ID does not exist" containerID="3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae"
Apr 17 08:15:44.614917 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.614857 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae"} err="failed to get container status
\"3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae\": rpc error: code = NotFound desc = could not find container \"3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae\": container with ID starting with 3a5075239c88d147459f5c85570e8a372b84640c199902936c929702f6ba83ae not found: ID does not exist" Apr 17 08:15:44.628074 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.628054 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"] Apr 17 08:15:44.632403 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:44.632378 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-375fe-5b588c76f8-lm94t"] Apr 17 08:15:46.133684 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:46.133649 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" path="/var/lib/kubelet/pods/5d64e4e4-969c-441f-932a-9610f8b815d0/volumes" Apr 17 08:15:48.674890 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.674858 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8"] Apr 17 08:15:48.675252 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.675181 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" Apr 17 08:15:48.675252 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.675194 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" Apr 17 08:15:48.675252 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.675207 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" Apr 17 08:15:48.675252 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.675213 2560 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" Apr 17 08:15:48.675380 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.675269 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="7acd7c46-18d3-4fa4-9697-1f83492ffd81" containerName="ensemble-graph-390bf" Apr 17 08:15:48.675380 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.675277 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d64e4e4-969c-441f-932a-9610f8b815d0" containerName="sequence-graph-375fe" Apr 17 08:15:48.678229 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.678212 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.680474 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.680449 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-65d2e-kube-rbac-proxy-sar-config\"" Apr 17 08:15:48.680588 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.680556 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:15:48.681409 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.681388 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-65d2e-serving-cert\"" Apr 17 08:15:48.681517 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.681408 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\"" Apr 17 08:15:48.685488 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.685295 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8"] Apr 17 08:15:48.817061 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.817019 2560 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-proxy-tls\") pod \"ensemble-graph-65d2e-5d4794777-nd7j8\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.817061 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.817063 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-openshift-service-ca-bundle\") pod \"ensemble-graph-65d2e-5d4794777-nd7j8\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.917802 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.917765 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-proxy-tls\") pod \"ensemble-graph-65d2e-5d4794777-nd7j8\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.918002 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.917816 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-openshift-service-ca-bundle\") pod \"ensemble-graph-65d2e-5d4794777-nd7j8\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.918431 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.918404 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-openshift-service-ca-bundle\") pod \"ensemble-graph-65d2e-5d4794777-nd7j8\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.920171 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.920151 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-proxy-tls\") pod \"ensemble-graph-65d2e-5d4794777-nd7j8\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:48.989894 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:48.989802 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:49.108710 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:49.108678 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8"] Apr 17 08:15:49.111800 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:15:49.111772 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc33d4c4_cc9f_43d5_8c27_3bee24ca8346.slice/crio-ea1315426a7b534eb2a3e8d5be18754e22e00b3cd0536f9e5c072e17efc65e81 WatchSource:0}: Error finding container ea1315426a7b534eb2a3e8d5be18754e22e00b3cd0536f9e5c072e17efc65e81: Status 404 returned error can't find the container with id ea1315426a7b534eb2a3e8d5be18754e22e00b3cd0536f9e5c072e17efc65e81 Apr 17 08:15:49.625344 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:49.625310 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" event={"ID":"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346","Type":"ContainerStarted","Data":"998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d"} Apr 17 
08:15:49.625531 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:49.625350 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" event={"ID":"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346","Type":"ContainerStarted","Data":"ea1315426a7b534eb2a3e8d5be18754e22e00b3cd0536f9e5c072e17efc65e81"} Apr 17 08:15:49.625531 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:49.625447 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:15:49.640798 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:49.640745 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podStartSLOduration=1.640728181 podStartE2EDuration="1.640728181s" podCreationTimestamp="2026-04-17 08:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:15:49.639789521 +0000 UTC m=+1408.126911324" watchObservedRunningTime="2026-04-17 08:15:49.640728181 +0000 UTC m=+1408.127849985" Apr 17 08:15:55.635045 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:15:55.635012 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:16:24.445367 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.445328 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f"] Apr 17 08:16:24.449829 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.449807 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:24.452119 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.452098 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-60721-serving-cert\"" Apr 17 08:16:24.452238 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.452102 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-60721-kube-rbac-proxy-sar-config\"" Apr 17 08:16:24.454926 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.454904 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f"] Apr 17 08:16:24.514135 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.514102 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-openshift-service-ca-bundle\") pod \"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:24.514135 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.514138 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls\") pod \"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:24.614710 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.614670 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-openshift-service-ca-bundle\") pod 
\"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:24.614710 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.614713 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls\") pod \"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:24.614965 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:16:24.614879 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-60721-serving-cert: secret "sequence-graph-60721-serving-cert" not found Apr 17 08:16:24.614965 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:16:24.614934 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls podName:ea0bfedf-ed45-48b2-80eb-2c4800b08d07 nodeName:}" failed. No retries permitted until 2026-04-17 08:16:25.11491591 +0000 UTC m=+1443.602037692 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls") pod "sequence-graph-60721-6999cb44f9-wjm7f" (UID: "ea0bfedf-ed45-48b2-80eb-2c4800b08d07") : secret "sequence-graph-60721-serving-cert" not found Apr 17 08:16:24.615420 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:24.615396 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-openshift-service-ca-bundle\") pod \"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:25.118768 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.118717 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls\") pod \"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:25.121218 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.121185 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls\") pod \"sequence-graph-60721-6999cb44f9-wjm7f\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:25.360980 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.360944 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:25.484465 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.484440 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f"] Apr 17 08:16:25.486878 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:16:25.486851 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0bfedf_ed45_48b2_80eb_2c4800b08d07.slice/crio-de71c1cfb9f5b8e5be4f6db1097160b1dd78a1bae1329fec0534bb3d64ae0c39 WatchSource:0}: Error finding container de71c1cfb9f5b8e5be4f6db1097160b1dd78a1bae1329fec0534bb3d64ae0c39: Status 404 returned error can't find the container with id de71c1cfb9f5b8e5be4f6db1097160b1dd78a1bae1329fec0534bb3d64ae0c39 Apr 17 08:16:25.746808 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.746712 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" event={"ID":"ea0bfedf-ed45-48b2-80eb-2c4800b08d07","Type":"ContainerStarted","Data":"328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7"} Apr 17 08:16:25.746808 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.746753 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" event={"ID":"ea0bfedf-ed45-48b2-80eb-2c4800b08d07","Type":"ContainerStarted","Data":"de71c1cfb9f5b8e5be4f6db1097160b1dd78a1bae1329fec0534bb3d64ae0c39"} Apr 17 08:16:25.747057 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.746884 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:16:25.762682 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:25.762637 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" 
podStartSLOduration=1.762624414 podStartE2EDuration="1.762624414s" podCreationTimestamp="2026-04-17 08:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:16:25.760758221 +0000 UTC m=+1444.247880035" watchObservedRunningTime="2026-04-17 08:16:25.762624414 +0000 UTC m=+1444.249746195" Apr 17 08:16:31.755438 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:16:31.755408 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:24:03.321766 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:03.321691 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8"] Apr 17 08:24:03.324128 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:03.321939 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" containerID="cri-o://998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d" gracePeriod=30 Apr 17 08:24:05.633202 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:05.633159 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:10.632943 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:10.632896 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:15.632886 ip-10-0-141-224 kubenswrapper[2560]: I0417 
08:24:15.632840 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:15.633361 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:15.633014 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:24:20.633055 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:20.633015 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:25.633218 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:25.633173 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:30.633352 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:30.633298 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:33.460098 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.460073 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:24:33.548420 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.548389 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-openshift-service-ca-bundle\") pod \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " Apr 17 08:24:33.548615 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.548462 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-proxy-tls\") pod \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\" (UID: \"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346\") " Apr 17 08:24:33.548848 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.548822 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" (UID: "dc33d4c4-cc9f-43d5-8c27-3bee24ca8346"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:24:33.550511 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.550491 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" (UID: "dc33d4c4-cc9f-43d5-8c27-3bee24ca8346"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:24:33.649431 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.649348 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:24:33.649431 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:33.649377 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:24:34.322338 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.322298 2560 generic.go:358] "Generic (PLEG): container finished" podID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerID="998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d" exitCode=0 Apr 17 08:24:34.322614 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.322380 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" event={"ID":"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346","Type":"ContainerDied","Data":"998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d"} Apr 17 08:24:34.322614 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.322395 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" Apr 17 08:24:34.322614 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.322418 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8" event={"ID":"dc33d4c4-cc9f-43d5-8c27-3bee24ca8346","Type":"ContainerDied","Data":"ea1315426a7b534eb2a3e8d5be18754e22e00b3cd0536f9e5c072e17efc65e81"} Apr 17 08:24:34.322614 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.322433 2560 scope.go:117] "RemoveContainer" containerID="998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d" Apr 17 08:24:34.332224 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.332026 2560 scope.go:117] "RemoveContainer" containerID="998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d" Apr 17 08:24:34.332312 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:24:34.332294 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d\": container with ID starting with 998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d not found: ID does not exist" containerID="998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d" Apr 17 08:24:34.332360 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.332321 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d"} err="failed to get container status \"998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d\": rpc error: code = NotFound desc = could not find container \"998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d\": container with ID starting with 998838b1322baa7d357bf8a1d6543c85c20e7f247df0c7d8312701e8b0dca31d not found: ID does not exist" Apr 17 08:24:34.339994 ip-10-0-141-224 kubenswrapper[2560]: I0417 
08:24:34.339963 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8"] Apr 17 08:24:34.345484 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:34.345463 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-65d2e-5d4794777-nd7j8"] Apr 17 08:24:36.132855 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:36.132818 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" path="/var/lib/kubelet/pods/dc33d4c4-cc9f-43d5-8c27-3bee24ca8346/volumes" Apr 17 08:24:39.119077 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:39.119048 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f"] Apr 17 08:24:39.119452 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:39.119277 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" containerID="cri-o://328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7" gracePeriod=30 Apr 17 08:24:41.753422 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:41.753383 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:46.753844 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:46.753804 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:51.753895 ip-10-0-141-224 kubenswrapper[2560]: I0417 
08:24:51.753844 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:24:51.754315 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:51.753978 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:24:56.753937 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:24:56.753889 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:01.753673 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:01.753632 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:06.753801 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:06.753750 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:09.256926 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.256899 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:25:09.342361 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.342330 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-openshift-service-ca-bundle\") pod \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " Apr 17 08:25:09.342512 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.342370 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls\") pod \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\" (UID: \"ea0bfedf-ed45-48b2-80eb-2c4800b08d07\") " Apr 17 08:25:09.342695 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.342668 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ea0bfedf-ed45-48b2-80eb-2c4800b08d07" (UID: "ea0bfedf-ed45-48b2-80eb-2c4800b08d07"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:25:09.344232 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.344208 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea0bfedf-ed45-48b2-80eb-2c4800b08d07" (UID: "ea0bfedf-ed45-48b2-80eb-2c4800b08d07"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:25:09.434510 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.434423 2560 generic.go:358] "Generic (PLEG): container finished" podID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerID="328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7" exitCode=0 Apr 17 08:25:09.434510 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.434482 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" Apr 17 08:25:09.434687 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.434510 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" event={"ID":"ea0bfedf-ed45-48b2-80eb-2c4800b08d07","Type":"ContainerDied","Data":"328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7"} Apr 17 08:25:09.434687 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.434548 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f" event={"ID":"ea0bfedf-ed45-48b2-80eb-2c4800b08d07","Type":"ContainerDied","Data":"de71c1cfb9f5b8e5be4f6db1097160b1dd78a1bae1329fec0534bb3d64ae0c39"} Apr 17 08:25:09.434687 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.434568 2560 scope.go:117] "RemoveContainer" containerID="328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7" Apr 17 08:25:09.443644 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.443608 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:25:09.443768 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.443648 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ea0bfedf-ed45-48b2-80eb-2c4800b08d07-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:25:09.448540 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.448522 2560 scope.go:117] "RemoveContainer" containerID="328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7" Apr 17 08:25:09.448774 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:25:09.448751 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7\": container with ID starting with 328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7 not found: ID does not exist" containerID="328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7" Apr 17 08:25:09.448841 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.448782 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7"} err="failed to get container status \"328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7\": rpc error: code = NotFound desc = could not find container \"328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7\": container with ID starting with 328d6206f2dbe115f79334929d6e88207dcebef97951b0187de6788da22e6ba7 not found: ID does not exist" Apr 17 08:25:09.457400 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.457378 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f"] Apr 17 08:25:09.460771 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:09.460749 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-60721-6999cb44f9-wjm7f"] Apr 17 08:25:10.134349 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:10.134313 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" path="/var/lib/kubelet/pods/ea0bfedf-ed45-48b2-80eb-2c4800b08d07/volumes" Apr 17 08:25:13.579897 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.579865 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll"] Apr 17 08:25:13.580357 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.580337 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" Apr 17 08:25:13.580357 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.580357 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" Apr 17 08:25:13.580466 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.580370 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" Apr 17 08:25:13.580466 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.580379 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" Apr 17 08:25:13.580567 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.580500 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc33d4c4-cc9f-43d5-8c27-3bee24ca8346" containerName="ensemble-graph-65d2e" Apr 17 08:25:13.580567 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.580514 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea0bfedf-ed45-48b2-80eb-2c4800b08d07" containerName="sequence-graph-60721" Apr 17 08:25:13.584826 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.584807 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:13.587253 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.587227 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-00a8c-kube-rbac-proxy-sar-config\"" Apr 17 08:25:13.587381 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.587254 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\"" Apr 17 08:25:13.587381 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.587227 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-00a8c-serving-cert\"" Apr 17 08:25:13.588144 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.588130 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:25:13.591228 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.591205 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll"] Apr 17 08:25:13.681752 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.681722 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/556f95b4-0d29-4c0c-bec8-15fac50921c7-openshift-service-ca-bundle\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:13.681918 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.681773 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: 
\"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:13.783097 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.783052 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/556f95b4-0d29-4c0c-bec8-15fac50921c7-openshift-service-ca-bundle\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:13.783281 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.783132 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:13.783336 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:25:13.783273 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-00a8c-serving-cert: secret "splitter-graph-00a8c-serving-cert" not found Apr 17 08:25:13.783374 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:25:13.783340 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls podName:556f95b4-0d29-4c0c-bec8-15fac50921c7 nodeName:}" failed. No retries permitted until 2026-04-17 08:25:14.283323456 +0000 UTC m=+1972.770445241 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls") pod "splitter-graph-00a8c-7d99f4c66f-pdqll" (UID: "556f95b4-0d29-4c0c-bec8-15fac50921c7") : secret "splitter-graph-00a8c-serving-cert" not found Apr 17 08:25:13.783706 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:13.783688 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/556f95b4-0d29-4c0c-bec8-15fac50921c7-openshift-service-ca-bundle\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:14.288439 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:14.288399 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:14.290769 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:14.290748 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls\") pod \"splitter-graph-00a8c-7d99f4c66f-pdqll\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:14.495138 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:14.495092 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:14.610768 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:14.610745 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll"] Apr 17 08:25:14.612903 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:25:14.612873 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556f95b4_0d29_4c0c_bec8_15fac50921c7.slice/crio-4f0cd148ea633a7f335962cdb2505070aaed94a70847ddf7689544e179eb6c1c WatchSource:0}: Error finding container 4f0cd148ea633a7f335962cdb2505070aaed94a70847ddf7689544e179eb6c1c: Status 404 returned error can't find the container with id 4f0cd148ea633a7f335962cdb2505070aaed94a70847ddf7689544e179eb6c1c Apr 17 08:25:14.614769 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:14.614749 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:25:15.456338 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:15.456300 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" event={"ID":"556f95b4-0d29-4c0c-bec8-15fac50921c7","Type":"ContainerStarted","Data":"af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14"} Apr 17 08:25:15.456338 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:15.456340 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" event={"ID":"556f95b4-0d29-4c0c-bec8-15fac50921c7","Type":"ContainerStarted","Data":"4f0cd148ea633a7f335962cdb2505070aaed94a70847ddf7689544e179eb6c1c"} Apr 17 08:25:15.456541 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:15.456388 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:15.470634 ip-10-0-141-224 
kubenswrapper[2560]: I0417 08:25:15.470590 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podStartSLOduration=2.470576222 podStartE2EDuration="2.470576222s" podCreationTimestamp="2026-04-17 08:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:25:15.469980289 +0000 UTC m=+1973.957102094" watchObservedRunningTime="2026-04-17 08:25:15.470576222 +0000 UTC m=+1973.957698025" Apr 17 08:25:21.464259 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:21.464232 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:23.639460 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:23.639428 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll"] Apr 17 08:25:23.639812 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:23.639648 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" containerID="cri-o://af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14" gracePeriod=30 Apr 17 08:25:26.463756 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:26.463720 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:31.463401 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:31.463352 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" 
containerName="splitter-graph-00a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:36.464553 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:36.464503 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:36.465023 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:36.464624 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:41.463429 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:41.463385 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:46.463565 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:46.463519 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:49.403707 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.403667 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l"] Apr 17 08:25:49.408623 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.408601 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:49.410835 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.410812 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-54cf7-serving-cert\"" Apr 17 08:25:49.410946 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.410813 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-54cf7-kube-rbac-proxy-sar-config\"" Apr 17 08:25:49.413464 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.413445 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l"] Apr 17 08:25:49.447323 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.447294 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027375f1-69ee-41d8-9a78-9c7da61f7450-openshift-service-ca-bundle\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:49.447425 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.447394 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:49.548167 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.548134 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: 
\"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:49.548321 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.548182 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027375f1-69ee-41d8-9a78-9c7da61f7450-openshift-service-ca-bundle\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:49.548321 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:25:49.548295 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-54cf7-serving-cert: secret "switch-graph-54cf7-serving-cert" not found Apr 17 08:25:49.548409 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:25:49.548383 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls podName:027375f1-69ee-41d8-9a78-9c7da61f7450 nodeName:}" failed. No retries permitted until 2026-04-17 08:25:50.048365995 +0000 UTC m=+2008.535487776 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls") pod "switch-graph-54cf7-669b44b5bb-25b6l" (UID: "027375f1-69ee-41d8-9a78-9c7da61f7450") : secret "switch-graph-54cf7-serving-cert" not found Apr 17 08:25:49.548804 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:49.548787 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027375f1-69ee-41d8-9a78-9c7da61f7450-openshift-service-ca-bundle\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:50.052263 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.052229 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:50.054609 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.054580 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls\") pod \"switch-graph-54cf7-669b44b5bb-25b6l\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:50.319605 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.319573 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:50.442201 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.442177 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l"] Apr 17 08:25:50.444586 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:25:50.444560 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027375f1_69ee_41d8_9a78_9c7da61f7450.slice/crio-76c55acdd8fa77dd4cfa9d5f8860aae7162e4a58ac98a83ae429eb25638f1a02 WatchSource:0}: Error finding container 76c55acdd8fa77dd4cfa9d5f8860aae7162e4a58ac98a83ae429eb25638f1a02: Status 404 returned error can't find the container with id 76c55acdd8fa77dd4cfa9d5f8860aae7162e4a58ac98a83ae429eb25638f1a02 Apr 17 08:25:50.569190 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.569147 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" event={"ID":"027375f1-69ee-41d8-9a78-9c7da61f7450","Type":"ContainerStarted","Data":"93beab7426bb77e6af8f4340dc2e85497062a5d14fe957d6c46d6b1e4275531b"} Apr 17 08:25:50.569190 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.569183 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" event={"ID":"027375f1-69ee-41d8-9a78-9c7da61f7450","Type":"ContainerStarted","Data":"76c55acdd8fa77dd4cfa9d5f8860aae7162e4a58ac98a83ae429eb25638f1a02"} Apr 17 08:25:50.569386 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.569282 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:25:50.584881 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:50.584802 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" 
podStartSLOduration=1.5847905450000002 podStartE2EDuration="1.584790545s" podCreationTimestamp="2026-04-17 08:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:25:50.584167336 +0000 UTC m=+2009.071289139" watchObservedRunningTime="2026-04-17 08:25:50.584790545 +0000 UTC m=+2009.071912347" Apr 17 08:25:51.463489 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:51.463452 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:25:53.776059 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.776038 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:53.886136 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.886098 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls\") pod \"556f95b4-0d29-4c0c-bec8-15fac50921c7\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " Apr 17 08:25:53.886290 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.886243 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/556f95b4-0d29-4c0c-bec8-15fac50921c7-openshift-service-ca-bundle\") pod \"556f95b4-0d29-4c0c-bec8-15fac50921c7\" (UID: \"556f95b4-0d29-4c0c-bec8-15fac50921c7\") " Apr 17 08:25:53.886564 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.886535 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556f95b4-0d29-4c0c-bec8-15fac50921c7-openshift-service-ca-bundle" 
(OuterVolumeSpecName: "openshift-service-ca-bundle") pod "556f95b4-0d29-4c0c-bec8-15fac50921c7" (UID: "556f95b4-0d29-4c0c-bec8-15fac50921c7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:25:53.888096 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.888074 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "556f95b4-0d29-4c0c-bec8-15fac50921c7" (UID: "556f95b4-0d29-4c0c-bec8-15fac50921c7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:25:53.986836 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.986749 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/556f95b4-0d29-4c0c-bec8-15fac50921c7-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:25:53.986836 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:53.986780 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/556f95b4-0d29-4c0c-bec8-15fac50921c7-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:25:54.587807 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.587772 2560 generic.go:358] "Generic (PLEG): container finished" podID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerID="af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14" exitCode=0 Apr 17 08:25:54.587952 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.587835 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" Apr 17 08:25:54.587952 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.587838 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" event={"ID":"556f95b4-0d29-4c0c-bec8-15fac50921c7","Type":"ContainerDied","Data":"af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14"} Apr 17 08:25:54.587952 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.587946 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll" event={"ID":"556f95b4-0d29-4c0c-bec8-15fac50921c7","Type":"ContainerDied","Data":"4f0cd148ea633a7f335962cdb2505070aaed94a70847ddf7689544e179eb6c1c"} Apr 17 08:25:54.588116 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.587961 2560 scope.go:117] "RemoveContainer" containerID="af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14" Apr 17 08:25:54.595762 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.595736 2560 scope.go:117] "RemoveContainer" containerID="af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14" Apr 17 08:25:54.596013 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:25:54.595979 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14\": container with ID starting with af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14 not found: ID does not exist" containerID="af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14" Apr 17 08:25:54.596061 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.596026 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14"} err="failed to get container status 
\"af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14\": rpc error: code = NotFound desc = could not find container \"af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14\": container with ID starting with af5dbdc8978b0f15f3db83fee78e4093b0ee3ab814c4f57210af7cf55e44fd14 not found: ID does not exist" Apr 17 08:25:54.602867 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.602845 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll"] Apr 17 08:25:54.606593 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:54.606574 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-00a8c-7d99f4c66f-pdqll"] Apr 17 08:25:56.132979 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:56.132943 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" path="/var/lib/kubelet/pods/556f95b4-0d29-4c0c-bec8-15fac50921c7/volumes" Apr 17 08:25:56.577682 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:25:56.577654 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:26:33.868316 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.868284 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb"] Apr 17 08:26:33.868772 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.868601 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" Apr 17 08:26:33.868772 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.868612 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" Apr 17 08:26:33.868772 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.868676 2560 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="556f95b4-0d29-4c0c-bec8-15fac50921c7" containerName="splitter-graph-00a8c" Apr 17 08:26:33.871838 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.871820 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:33.874219 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.874200 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-32e35-serving-cert\"" Apr 17 08:26:33.874324 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.874208 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-32e35-kube-rbac-proxy-sar-config\"" Apr 17 08:26:33.877532 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.877507 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb"] Apr 17 08:26:33.906762 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.906729 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316b0a75-6515-4cdf-966d-b5d52752202d-openshift-service-ca-bundle\") pod \"splitter-graph-32e35-6c6f6d485b-vhhcb\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:33.906903 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:33.906826 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/316b0a75-6515-4cdf-966d-b5d52752202d-proxy-tls\") pod \"splitter-graph-32e35-6c6f6d485b-vhhcb\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.007727 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.007692 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/316b0a75-6515-4cdf-966d-b5d52752202d-proxy-tls\") pod \"splitter-graph-32e35-6c6f6d485b-vhhcb\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.007876 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.007780 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316b0a75-6515-4cdf-966d-b5d52752202d-openshift-service-ca-bundle\") pod \"splitter-graph-32e35-6c6f6d485b-vhhcb\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.008477 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.008458 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316b0a75-6515-4cdf-966d-b5d52752202d-openshift-service-ca-bundle\") pod \"splitter-graph-32e35-6c6f6d485b-vhhcb\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.010161 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.010143 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/316b0a75-6515-4cdf-966d-b5d52752202d-proxy-tls\") pod \"splitter-graph-32e35-6c6f6d485b-vhhcb\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.183220 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.183127 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.503328 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.503202 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb"] Apr 17 08:26:34.506067 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:26:34.506038 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316b0a75_6515_4cdf_966d_b5d52752202d.slice/crio-2f6a7dd0960501dd630a685c6ab8312f6ed7528e894512df1c3c8b3bb491563d WatchSource:0}: Error finding container 2f6a7dd0960501dd630a685c6ab8312f6ed7528e894512df1c3c8b3bb491563d: Status 404 returned error can't find the container with id 2f6a7dd0960501dd630a685c6ab8312f6ed7528e894512df1c3c8b3bb491563d Apr 17 08:26:34.720339 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.720301 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" event={"ID":"316b0a75-6515-4cdf-966d-b5d52752202d","Type":"ContainerStarted","Data":"68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86"} Apr 17 08:26:34.720339 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.720341 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" event={"ID":"316b0a75-6515-4cdf-966d-b5d52752202d","Type":"ContainerStarted","Data":"2f6a7dd0960501dd630a685c6ab8312f6ed7528e894512df1c3c8b3bb491563d"} Apr 17 08:26:34.720557 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.720428 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:26:34.736303 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:34.736235 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" 
podStartSLOduration=1.7362215829999998 podStartE2EDuration="1.736221583s" podCreationTimestamp="2026-04-17 08:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:26:34.735041388 +0000 UTC m=+2053.222163190" watchObservedRunningTime="2026-04-17 08:26:34.736221583 +0000 UTC m=+2053.223343422" Apr 17 08:26:40.730458 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:26:40.730380 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:34:48.542538 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:34:48.542457 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb"] Apr 17 08:34:48.544936 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:34:48.542766 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" containerID="cri-o://68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86" gracePeriod=30 Apr 17 08:34:50.728030 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:34:50.727965 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:34:55.727936 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:34:55.727896 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:35:00.727798 ip-10-0-141-224 kubenswrapper[2560]: I0417 
08:35:00.727754 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:35:00.728187 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:00.727849 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:35:05.728555 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:05.728506 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:35:10.727887 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:10.727849 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:35:15.727532 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:15.727485 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:35:18.678716 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.678692 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:35:18.725596 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.725568 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/316b0a75-6515-4cdf-966d-b5d52752202d-proxy-tls\") pod \"316b0a75-6515-4cdf-966d-b5d52752202d\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " Apr 17 08:35:18.725752 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.725616 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316b0a75-6515-4cdf-966d-b5d52752202d-openshift-service-ca-bundle\") pod \"316b0a75-6515-4cdf-966d-b5d52752202d\" (UID: \"316b0a75-6515-4cdf-966d-b5d52752202d\") " Apr 17 08:35:18.726010 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.725951 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316b0a75-6515-4cdf-966d-b5d52752202d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "316b0a75-6515-4cdf-966d-b5d52752202d" (UID: "316b0a75-6515-4cdf-966d-b5d52752202d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:35:18.727503 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.727482 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316b0a75-6515-4cdf-966d-b5d52752202d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "316b0a75-6515-4cdf-966d-b5d52752202d" (UID: "316b0a75-6515-4cdf-966d-b5d52752202d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:35:18.827081 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.827051 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/316b0a75-6515-4cdf-966d-b5d52752202d-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:35:18.827081 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:18.827080 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316b0a75-6515-4cdf-966d-b5d52752202d-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:35:19.363848 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.363813 2560 generic.go:358] "Generic (PLEG): container finished" podID="316b0a75-6515-4cdf-966d-b5d52752202d" containerID="68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86" exitCode=0 Apr 17 08:35:19.364023 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.363858 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" event={"ID":"316b0a75-6515-4cdf-966d-b5d52752202d","Type":"ContainerDied","Data":"68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86"} Apr 17 08:35:19.364023 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.363870 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" Apr 17 08:35:19.364023 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.363893 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb" event={"ID":"316b0a75-6515-4cdf-966d-b5d52752202d","Type":"ContainerDied","Data":"2f6a7dd0960501dd630a685c6ab8312f6ed7528e894512df1c3c8b3bb491563d"} Apr 17 08:35:19.364023 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.363913 2560 scope.go:117] "RemoveContainer" containerID="68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86" Apr 17 08:35:19.371743 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.371721 2560 scope.go:117] "RemoveContainer" containerID="68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86" Apr 17 08:35:19.372038 ip-10-0-141-224 kubenswrapper[2560]: E0417 08:35:19.372010 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86\": container with ID starting with 68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86 not found: ID does not exist" containerID="68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86" Apr 17 08:35:19.372096 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.372049 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86"} err="failed to get container status \"68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86\": rpc error: code = NotFound desc = could not find container \"68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86\": container with ID starting with 68878f77f6c1e46bec8d5a875f068f4b15950a4b314ba98dc2b04b5b5dbdfb86 not found: ID does not exist" Apr 17 08:35:19.384170 ip-10-0-141-224 kubenswrapper[2560]: I0417 
08:35:19.384147 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb"] Apr 17 08:35:19.386809 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:19.386787 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-32e35-6c6f6d485b-vhhcb"] Apr 17 08:35:20.132946 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:35:20.132912 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" path="/var/lib/kubelet/pods/316b0a75-6515-4cdf-966d-b5d52752202d/volumes" Apr 17 08:42:08.700128 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:08.700010 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l"] Apr 17 08:42:08.702722 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:08.700202 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" containerID="cri-o://93beab7426bb77e6af8f4340dc2e85497062a5d14fe957d6c46d6b1e4275531b" gracePeriod=30 Apr 17 08:42:11.576589 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:11.576545 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:42:16.576514 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:16.576472 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:42:21.576322 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:21.576283 2560 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:42:21.576695 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:21.576392 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:42:23.643679 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:23.643641 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:24.348587 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:24.348544 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:25.058961 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:25.058927 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:25.753595 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:25.753563 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:26.435857 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:26.435818 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:26.576607 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:26.576565 2560 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:42:27.114031 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:27.113997 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:27.792861 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:27.792832 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:28.490701 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:28.490662 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:29.187890 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:29.187859 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:29.881574 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:29.881547 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:30.551291 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:30.551251 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:31.221733 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:31.221692 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-54cf7-669b44b5bb-25b6l_027375f1-69ee-41d8-9a78-9c7da61f7450/switch-graph-54cf7/0.log" Apr 17 08:42:31.575919 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:31.575880 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:42:36.577008 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:36.576947 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:42:37.409882 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:37.409837 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8d5w9_0cebed97-5a79-4daf-8b36-7fb10b919eaa/global-pull-secret-syncer/0.log" Apr 17 08:42:37.580067 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:37.580035 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-65l87_59fbd464-53fc-455b-9630-7e429d74587e/konnectivity-agent/0.log" Apr 17 08:42:37.701810 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:37.701721 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-224.ec2.internal_2d05103e605d22dd69e84834218ff183/haproxy/0.log" Apr 17 08:42:38.758101 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.758063 2560 generic.go:358] "Generic (PLEG): container finished" podID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerID="93beab7426bb77e6af8f4340dc2e85497062a5d14fe957d6c46d6b1e4275531b" exitCode=0 Apr 17 08:42:38.758446 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.758139 2560 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" event={"ID":"027375f1-69ee-41d8-9a78-9c7da61f7450","Type":"ContainerDied","Data":"93beab7426bb77e6af8f4340dc2e85497062a5d14fe957d6c46d6b1e4275531b"} Apr 17 08:42:38.843636 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.843616 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:42:38.933726 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.933693 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls\") pod \"027375f1-69ee-41d8-9a78-9c7da61f7450\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " Apr 17 08:42:38.933726 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.933732 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027375f1-69ee-41d8-9a78-9c7da61f7450-openshift-service-ca-bundle\") pod \"027375f1-69ee-41d8-9a78-9c7da61f7450\" (UID: \"027375f1-69ee-41d8-9a78-9c7da61f7450\") " Apr 17 08:42:38.934161 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.934136 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027375f1-69ee-41d8-9a78-9c7da61f7450-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "027375f1-69ee-41d8-9a78-9c7da61f7450" (UID: "027375f1-69ee-41d8-9a78-9c7da61f7450"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:42:38.935692 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:38.935669 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "027375f1-69ee-41d8-9a78-9c7da61f7450" (UID: "027375f1-69ee-41d8-9a78-9c7da61f7450"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:42:39.034420 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.034332 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/027375f1-69ee-41d8-9a78-9c7da61f7450-proxy-tls\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:42:39.034420 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.034367 2560 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027375f1-69ee-41d8-9a78-9c7da61f7450-openshift-service-ca-bundle\") on node \"ip-10-0-141-224.ec2.internal\" DevicePath \"\"" Apr 17 08:42:39.767926 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.767896 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" Apr 17 08:42:39.768375 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.767895 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l" event={"ID":"027375f1-69ee-41d8-9a78-9c7da61f7450","Type":"ContainerDied","Data":"76c55acdd8fa77dd4cfa9d5f8860aae7162e4a58ac98a83ae429eb25638f1a02"} Apr 17 08:42:39.768375 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.768025 2560 scope.go:117] "RemoveContainer" containerID="93beab7426bb77e6af8f4340dc2e85497062a5d14fe957d6c46d6b1e4275531b" Apr 17 08:42:39.788199 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.788165 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l"] Apr 17 08:42:39.791036 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:39.791011 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-54cf7-669b44b5bb-25b6l"] Apr 17 08:42:40.132728 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.132683 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" path="/var/lib/kubelet/pods/027375f1-69ee-41d8-9a78-9c7da61f7450/volumes" Apr 17 08:42:40.821154 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.821128 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/alertmanager/0.log" Apr 17 08:42:40.848612 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.848580 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/config-reloader/0.log" Apr 17 08:42:40.873431 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.873402 2560 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/kube-rbac-proxy-web/0.log" Apr 17 08:42:40.905535 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.905506 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/kube-rbac-proxy/0.log" Apr 17 08:42:40.928972 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.928945 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/kube-rbac-proxy-metric/0.log" Apr 17 08:42:40.953717 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.953680 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/prom-label-proxy/0.log" Apr 17 08:42:40.976293 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:40.976269 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f3b69634-5c7f-46df-9cfe-1c3746d89b86/init-config-reloader/0.log" Apr 17 08:42:41.077918 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:41.077827 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-b57gm_26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f/kube-state-metrics/0.log" Apr 17 08:42:41.098087 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:41.098034 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-b57gm_26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f/kube-rbac-proxy-main/0.log" Apr 17 08:42:41.119840 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:41.119812 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-b57gm_26cf8eb2-838b-43fd-8fd1-d6b3cf466b8f/kube-rbac-proxy-self/0.log" Apr 17 08:42:41.301602 ip-10-0-141-224 kubenswrapper[2560]: I0417 
08:42:41.301571 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4cwcz_749cabee-7543-4558-9e74-fbd5becf3299/node-exporter/0.log"
Apr 17 08:42:41.336372 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:41.336296 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4cwcz_749cabee-7543-4558-9e74-fbd5becf3299/kube-rbac-proxy/0.log"
Apr 17 08:42:41.367240 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:41.367214 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4cwcz_749cabee-7543-4558-9e74-fbd5becf3299/init-textfile/0.log"
Apr 17 08:42:42.099860 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:42.099831 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586dd88f6-tz2xg_aa20a5ef-7210-40b0-9d12-74a82d28d52a/telemeter-client/0.log"
Apr 17 08:42:42.137773 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:42.137745 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586dd88f6-tz2xg_aa20a5ef-7210-40b0-9d12-74a82d28d52a/reload/0.log"
Apr 17 08:42:42.182325 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:42.182286 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-586dd88f6-tz2xg_aa20a5ef-7210-40b0-9d12-74a82d28d52a/kube-rbac-proxy/0.log"
Apr 17 08:42:44.557013 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.556966 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"]
Apr 17 08:42:44.557438 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.557421 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7"
Apr 17 08:42:44.557483 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.557441 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7"
Apr 17 08:42:44.557483 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.557461 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35"
Apr 17 08:42:44.557483 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.557469 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35"
Apr 17 08:42:44.557569 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.557552 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="027375f1-69ee-41d8-9a78-9c7da61f7450" containerName="switch-graph-54cf7"
Apr 17 08:42:44.557569 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.557566 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="316b0a75-6515-4cdf-966d-b5d52752202d" containerName="splitter-graph-32e35"
Apr 17 08:42:44.561973 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.561953 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.565003 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.564964 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gxzkb\"/\"default-dockercfg-pdhdf\""
Apr 17 08:42:44.565131 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.564976 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gxzkb\"/\"kube-root-ca.crt\""
Apr 17 08:42:44.565299 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.565279 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gxzkb\"/\"openshift-service-ca.crt\""
Apr 17 08:42:44.566015 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.565975 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"]
Apr 17 08:42:44.680966 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.680930 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-sys\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.681136 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.680979 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-lib-modules\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.681136 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.681029 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-254qm\" (UniqueName: \"kubernetes.io/projected/2dcb252d-0953-4713-8792-fed430f43f2a-kube-api-access-254qm\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.681136 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.681088 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-podres\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.681277 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.681138 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-proc\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781522 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781482 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-lib-modules\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781729 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781551 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-254qm\" (UniqueName: \"kubernetes.io/projected/2dcb252d-0953-4713-8792-fed430f43f2a-kube-api-access-254qm\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781729 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781607 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-podres\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781729 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781649 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-proc\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781729 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781665 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-lib-modules\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781729 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781687 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-sys\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781936 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781755 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-sys\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781936 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781767 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-podres\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.781936 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.781767 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2dcb252d-0953-4713-8792-fed430f43f2a-proc\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.788555 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.788531 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-254qm\" (UniqueName: \"kubernetes.io/projected/2dcb252d-0953-4713-8792-fed430f43f2a-kube-api-access-254qm\") pod \"perf-node-gather-daemonset-9p498\" (UID: \"2dcb252d-0953-4713-8792-fed430f43f2a\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.873797 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.873748 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:44.987032 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.986975 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"]
Apr 17 08:42:44.989940 ip-10-0-141-224 kubenswrapper[2560]: W0417 08:42:44.989908 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2dcb252d_0953_4713_8792_fed430f43f2a.slice/crio-c980db4338a8aa625f2eab595ddf3b91d769b3442272aa0cc42eefde851cfe89 WatchSource:0}: Error finding container c980db4338a8aa625f2eab595ddf3b91d769b3442272aa0cc42eefde851cfe89: Status 404 returned error can't find the container with id c980db4338a8aa625f2eab595ddf3b91d769b3442272aa0cc42eefde851cfe89
Apr 17 08:42:44.991582 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:44.991562 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:42:45.193083 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.193005 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4k2tj_00ef5608-cd73-4a1a-ac02-5e5da7727a5f/dns/0.log"
Apr 17 08:42:45.215079 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.215051 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4k2tj_00ef5608-cd73-4a1a-ac02-5e5da7727a5f/kube-rbac-proxy/0.log"
Apr 17 08:42:45.360745 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.360717 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zh8tn_e87601ff-22f7-4eb6-bb9e-5d78a6b02e12/dns-node-resolver/0.log"
Apr 17 08:42:45.740748 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.740717 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5b9ddc8f4-p749g_3ad78d61-0cc9-46c4-84cd-f522e68f9763/registry/0.log"
Apr 17 08:42:45.786447 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.786419 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8dpsb_03b7cc9b-71c3-4b06-9c37-a26058521703/node-ca/0.log"
Apr 17 08:42:45.788504 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.788473 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498" event={"ID":"2dcb252d-0953-4713-8792-fed430f43f2a","Type":"ContainerStarted","Data":"a007bfe531f2d70d97bfa739254b5518fe1e600fb89e9105da45ceb9d673e688"}
Apr 17 08:42:45.788504 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.788505 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498" event={"ID":"2dcb252d-0953-4713-8792-fed430f43f2a","Type":"ContainerStarted","Data":"c980db4338a8aa625f2eab595ddf3b91d769b3442272aa0cc42eefde851cfe89"}
Apr 17 08:42:45.788665 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.788573 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:45.803939 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:45.803897 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498" podStartSLOduration=1.803883385 podStartE2EDuration="1.803883385s" podCreationTimestamp="2026-04-17 08:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:42:45.802296821 +0000 UTC m=+3024.289418621" watchObservedRunningTime="2026-04-17 08:42:45.803883385 +0000 UTC m=+3024.291005185"
Apr 17 08:42:46.802763 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:46.802733 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mxsbk_f273b17c-4ccf-45b2-93e0-868dd9134101/serve-healthcheck-canary/0.log"
Apr 17 08:42:47.197077 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:47.197043 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4j7xs_23614c33-d1d2-4da0-8603-df308834ff05/kube-rbac-proxy/0.log"
Apr 17 08:42:47.215869 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:47.215842 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4j7xs_23614c33-d1d2-4da0-8603-df308834ff05/exporter/0.log"
Apr 17 08:42:47.235330 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:47.235293 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4j7xs_23614c33-d1d2-4da0-8603-df308834ff05/extractor/0.log"
Apr 17 08:42:49.217501 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:49.217464 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2zm6l_b4446917-6447-409d-9c3d-d26288ad810a/manager/0.log"
Apr 17 08:42:49.237603 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:49.237577 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-9nkw8_52aaae27-68eb-4dd5-a27b-19d94d278505/server/0.log"
Apr 17 08:42:51.801522 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:51.801491 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-9p498"
Apr 17 08:42:54.597132 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.597101 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/kube-multus-additional-cni-plugins/0.log"
Apr 17 08:42:54.615858 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.615826 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/egress-router-binary-copy/0.log"
Apr 17 08:42:54.635501 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.635476 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/cni-plugins/0.log"
Apr 17 08:42:54.653903 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.653873 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/bond-cni-plugin/0.log"
Apr 17 08:42:54.672847 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.672825 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/routeoverride-cni/0.log"
Apr 17 08:42:54.691679 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.691651 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/whereabouts-cni-bincopy/0.log"
Apr 17 08:42:54.711692 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:54.711621 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88cn6_475827a7-8f6f-4574-b5a7-05d38afa9444/whereabouts-cni/0.log"
Apr 17 08:42:55.048517 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:55.048448 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pqj85_0cbe3752-39a7-4fce-ab72-9683a62aba37/kube-multus/0.log"
Apr 17 08:42:55.108918 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:55.108886 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k4vcb_f6ca1d48-95c2-414b-af4e-838843029028/network-metrics-daemon/0.log"
Apr 17 08:42:55.126532 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:55.126501 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k4vcb_f6ca1d48-95c2-414b-af4e-838843029028/kube-rbac-proxy/0.log"
Apr 17 08:42:56.544203 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.544167 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/ovn-controller/0.log"
Apr 17 08:42:56.584897 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.584869 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/ovn-acl-logging/0.log"
Apr 17 08:42:56.606049 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.606020 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/kube-rbac-proxy-node/0.log"
Apr 17 08:42:56.629750 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.629719 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 08:42:56.645138 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.645106 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/northd/0.log"
Apr 17 08:42:56.664572 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.664544 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/nbdb/0.log"
Apr 17 08:42:56.682745 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.682719 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/sbdb/0.log"
Apr 17 08:42:56.845811 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:56.845771 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zfq9h_322ec8c6-8646-443d-9065-38a19aa96bd1/ovnkube-controller/0.log"
Apr 17 08:42:57.738950 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:57.738912 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vlg7t_a03d58cf-86b6-4ec5-be8c-e346b788c3d6/check-endpoints/0.log"
Apr 17 08:42:57.803339 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:57.803300 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vchg7_82c7c47d-33d8-4e71-8695-11aab98b699d/network-check-target-container/0.log"
Apr 17 08:42:58.670456 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:58.670424 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vqblm_e9c30bee-b3b0-40b8-8f95-46b04aca3c77/iptables-alerter/0.log"
Apr 17 08:42:59.286484 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:42:59.286459 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bpgx2_271684f4-9f94-4d1d-9c77-9fbf3a3219c9/tuned/0.log"
Apr 17 08:43:00.939999 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:43:00.939954 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-jczck_3b35dde2-df58-4352-9c06-578074e85124/cluster-samples-operator/0.log"
Apr 17 08:43:00.954711 ip-10-0-141-224 kubenswrapper[2560]: I0417 08:43:00.954680 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-jczck_3b35dde2-df58-4352-9c06-578074e85124/cluster-samples-operator-watch/0.log"