Apr 16 20:09:05.354323 ip-10-0-134-158 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:09:05.354333 ip-10-0-134-158 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:09:05.354340 ip-10-0-134-158 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:09:05.354571 ip-10-0-134-158 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:09:15.542952 ip-10-0-134-158 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:09:15.542971 ip-10-0-134-158 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2320059241bf42f5b146fdcdb9e8446a --
Apr 16 20:11:42.701416 ip-10-0-134-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:43.142564 ip-10-0-134-158 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:43.142564 ip-10-0-134-158 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:43.142564 ip-10-0-134-158 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:43.142564 ip-10-0-134-158 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:43.142564 ip-10-0-134-158 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:43.146054 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.145964 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:43.148279 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148264 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.148279 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148279 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148284 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148294 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148297 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148301 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148304 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148306 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148309 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148312 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148315 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148317 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148321 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148325 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148329 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148332 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148335 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148337 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148340 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148343 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148345 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.148342 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148348 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148351 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148354 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148357 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148359 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148362 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148364 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148367 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148371 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148375 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148378 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148380 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148383 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148386 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148390 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148393 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148396 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148399 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148402 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148404 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.148824 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148407 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148409 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148413 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148416 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148419 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148421 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148424 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148426 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148429 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148431 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148434 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148436 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148438 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148441 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148444 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148447 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148449 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148452 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148455 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.149395 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148458 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148460 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148463 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148465 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148468 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148470 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148473 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148476 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148479 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148482 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148484 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148486 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148489 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148491 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148494 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148496 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148499 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148502 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148504 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148506 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.149863 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148509 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148511 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148514 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148516 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148518 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.148521 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150330 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150337 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150341 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150344 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150347 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150350 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150352 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150355 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150365 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150368 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150371 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150374 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150377 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.150370 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150379 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150383 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150386 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150388 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150391 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150394 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150397 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150399 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150401 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150404 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150407 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150409 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150412 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150414 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150417 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150421 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150424 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150426 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150429 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.150862 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150432 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150435 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150437 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150439 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150442 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150444 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150447 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150449 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150452 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150454 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150456 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150459 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150461 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150463 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150466 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150469 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150472 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150475 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150479 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150483 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.151363 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150486 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150489 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150491 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150494 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150496 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150499 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150501 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150503 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150506 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150508 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150511 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150513 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150516 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150518 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150521 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150523 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150526 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150528 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150530 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.151867 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150534 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150536 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150539 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150541 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150543 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150546 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150548 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150552 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150555 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150558 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150561 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150563 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150566 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150568 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.150571 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150647 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150654 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150660 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150665 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150669 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150673 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150677 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:43.152357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150682 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150685 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150688 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150692 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150695 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150712 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150716 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150719 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150722 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150725 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150728 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150731 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150734 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150737 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150740 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150743 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150747 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150751 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150755 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150758 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150761 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150765 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150767 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150770 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150774 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:43.152887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150776 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150781 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150784 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150787 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150790 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150793 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150796 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150800 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150803 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150806 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150809 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150812 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150816 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150819 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16
20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150822 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150825 2574 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150828 2574 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150830 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150833 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150836 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150839 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150842 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150845 2574 flags.go:64] FLAG: --feature-gates="" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150849 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150852 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:11:43.153558 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150856 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150859 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150862 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150866 
2574 flags.go:64] FLAG: --help="false" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150869 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-134-158.ec2.internal" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150872 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150875 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150878 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150882 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150885 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150888 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150891 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150893 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150896 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150899 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150902 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150904 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:43.154181 
ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150907 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150910 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150913 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150915 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150918 2574 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150921 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150924 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:43.154181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150927 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150932 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150935 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150938 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150941 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150944 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150947 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150950 2574 
flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150953 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150958 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150961 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150965 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150968 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150971 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150974 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150977 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150980 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150983 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150986 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150993 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150996 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.150999 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 
20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151002 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:43.154815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151005 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151011 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151014 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151017 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151020 2574 flags.go:64] FLAG: --port="10250" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151023 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151026 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0df42066ab122bdc1" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151029 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151032 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151035 2574 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151038 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151041 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151045 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:43.155427 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:11:43.151047 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151051 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151055 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151058 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151061 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151064 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151067 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151070 2574 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151073 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151076 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151079 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151082 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151085 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:43.155427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151088 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151091 2574 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151094 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151097 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151100 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151103 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151118 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151121 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151124 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151126 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151132 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151134 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151137 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151141 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151145 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151147 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:43.156097 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:11:43.151150 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151153 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151156 2574 flags.go:64] FLAG: --v="2" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151161 2574 flags.go:64] FLAG: --version="false" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151166 2574 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151170 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.151174 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151265 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151269 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:43.156097 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151273 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151276 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151279 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151282 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151285 2574 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151288 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151290 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151293 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151296 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151298 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151301 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151303 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151306 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151308 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151311 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151314 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151316 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151318 2574 feature_gate.go:328] unrecognized 
feature gate: NetworkSegmentation Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151321 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151325 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:43.156725 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151328 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151331 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151334 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151336 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151339 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151343 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151346 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151348 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151351 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151354 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:43.157273 
ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151357 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151360 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151362 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151365 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151368 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151370 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151373 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151376 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151379 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151381 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:43.157273 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151384 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151386 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151389 2574 feature_gate.go:328] unrecognized 
feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151391 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151394 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151396 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151399 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151401 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151404 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151407 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151409 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151412 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151415 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151417 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151419 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:43.157768 ip-10-0-134-158 
kubenswrapper[2574]: W0416 20:11:43.151422 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151425 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151428 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151430 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:43.157768 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151432 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151435 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151437 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151440 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151442 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151445 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151447 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151450 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151453 2574 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151455 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151458 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151461 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151464 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151467 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151470 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151473 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151476 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151478 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151480 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.158326 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151483 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.158859 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151485 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.158859 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151493 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.158859 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151496 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.158859 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151498 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.158859 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.151501 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.158859 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.152146 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:43.159892 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.159872 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:11:43.159928 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.159893 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159944 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159949 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159952 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159955 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159958 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159961 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.159963 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159964 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159967 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159970 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159973 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159976 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159978 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159981 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159983 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159986 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159989 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159991 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159994 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159996 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.159999 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160002 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160005 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160008 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160012 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160015 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160018 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.160255 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160021 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160023 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160027 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160032 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160035 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160038 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160041 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160044 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160047 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160049 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160052 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160055 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160057 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160061 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160064 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160068 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160071 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160074 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160077 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.160781 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160079 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160082 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160085 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160088 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160091 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160093 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160096 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160099 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160102 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160122 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160125 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160127 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160130 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160132 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160135 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160138 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160140 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160143 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160145 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160148 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.161270 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160150 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160153 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160155 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160157 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160160 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160162 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160165 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160168 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160171 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160174 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160177 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160181 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160183 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160186 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160188 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160191 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160193 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160195 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160198 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160201 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.161765 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160203 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.160208 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160304 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160310 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160313 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160316 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160319 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160321 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160324 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160327 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160329 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160332 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160334 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160337 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160339 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160342 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.162309 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160345 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160347 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160349 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160352 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160356 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160360 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160364 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160367 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160371 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160374 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160377 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160380 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160382 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160385 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160388 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160390 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160393 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160395 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160398 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.162717 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160400 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160403 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160405 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160408 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160410 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160413 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160415 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160418 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160420 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160423 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160425 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160427 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160430 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160432 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160435 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160437 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160440 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160443 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160446 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160448 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.163219 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160451 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160453 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160457 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160460 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160462 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160465 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160467 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160470 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160472 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160475 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160477 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160480 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160482 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160485 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160487 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160490 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160493 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160497 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160499 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160502 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.163701 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160504 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160507 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160509 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160512 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160515 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160517 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160519 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160522 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160525 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160527 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160530 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160533 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:43.160535 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.160540 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:43.164298 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.161322 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:11:43.164753 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.164739 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:11:43.165872 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.165861 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 20:11:43.165985 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.165967 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:43.166037 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.166011 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:43.190804 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.190779 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:43.193259 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.193233 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:43.211259 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.211234 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:11:43.216911 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.216897 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 20:11:43.218122 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.218093 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:11:43.220700 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.220681 2574 fs.go:135] Filesystem UUIDs: map[0bc991be-fb86-4c45-9997-778a33cc89db:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d5bd8cb6-09e5-42e0-952d-f688f8b1f800:/dev/nvme0n1p4]
Apr 16 20:11:43.220757 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.220701 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:11:43.227715 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.227695 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:43.227797 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.227640 2574 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:43.225253402 +0000 UTC m=+0.400322566 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098520 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26d991140048af5fde990821869e33 SystemUUID:ec26d991-1400-48af-5fde-990821869e33 BootID:23200592-41bf-42f5-b146-fdcdb9e8446a Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:93:96:d8:75:87 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:93:96:d8:75:87 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:4c:b7:aa:c0:30 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:11:43.227797 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.227775 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:11:43.228052 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.227915 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:11:43.231201 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.231172 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:11:43.231339 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.231203 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:43.231385 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.231349 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:43.231385 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.231358 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:43.231385 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.231371
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:11:43.232358 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.232348 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:11:43.233184 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.233174 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:11:43.233293 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.233284 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:11:43.236430 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.236420 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:11:43.236482 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.236432 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:11:43.236482 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.236445 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:11:43.236482 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.236455 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:11:43.236482 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.236464 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:11:43.237603 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.237592 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:43.237638 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.237610 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:43.240630 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.240609 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:11:43.242450 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:11:43.242437 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:11:43.243784 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243767 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:11:43.243784 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243786 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243793 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243798 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243804 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243810 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243816 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243822 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243829 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243835 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243845 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
20:11:43.243907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.243854 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:11:43.244698 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.244688 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:11:43.244698 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.244699 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:11:43.248225 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.248211 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:11:43.248297 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.248249 2574 server.go:1295] "Started kubelet" Apr 16 20:11:43.248374 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.248346 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:11:43.248889 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.248839 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:11:43.248959 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.248913 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:11:43.249244 ip-10-0-134-158 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 20:11:43.249989 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.249917 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:11:43.251169 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.251152 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:11:43.253815 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.253787 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 20:11:43.253940 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.253786 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 20:11:43.254453 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.254437 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 20:11:43.255216 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.255199 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:43.255757 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.255738 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:11:43.257507 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257485 2574 factory.go:55] Registering systemd factory
Apr 16 20:11:43.257587 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257552 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:11:43.257648 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.257548 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.257746 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257725 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:11:43.257921 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257909 2574 factory.go:153] Registering CRI-O factory
Apr 16 20:11:43.257921 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257923 2574 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:11:43.258058 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257965 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:11:43.258058 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.257987 2574 factory.go:103] Registering Raw factory
Apr 16 20:11:43.258058 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.258000 2574 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:11:43.258335 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.258318 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:11:43.258335 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.258339 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:11:43.258467 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.258427 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:11:43.258467 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.258440 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:11:43.258561 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.258493 2574 manager.go:319] Starting recovery of all containers
Apr 16 20:11:43.261100 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.261057 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 20:11:43.261992 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.260821 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-158.ec2.internal.18a6ef6112a9cc93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-158.ec2.internal,UID:ip-10-0-134-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-158.ec2.internal,},FirstTimestamp:2026-04-16 20:11:43.248223379 +0000 UTC m=+0.423292538,LastTimestamp:2026-04-16 20:11:43.248223379 +0000 UTC m=+0.423292538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-158.ec2.internal,}"
Apr 16 20:11:43.262185 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.262160 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 20:11:43.270877 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.270850 2574 manager.go:324] Recovery completed
Apr 16 20:11:43.273817 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.273797 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f6rn2"
Apr 16 20:11:43.275083 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.275066 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.278442 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.278372 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.278442 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.278405 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.278442 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.278416 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.278974 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.278957 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:11:43.278974 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.278972 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:11:43.279085 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.278991 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:43.281265 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.281251 2574 policy_none.go:49] "None policy: Start"
Apr 16 20:11:43.281324 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.281268 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:11:43.281324 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.281279 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:11:43.282369 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.280476 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-158.ec2.internal.18a6ef6114761513 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-158.ec2.internal,UID:ip-10-0-134-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-158.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-158.ec2.internal,},FirstTimestamp:2026-04-16 20:11:43.278388499 +0000 UTC m=+0.453457661,LastTimestamp:2026-04-16 20:11:43.278388499 +0000 UTC m=+0.453457661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-158.ec2.internal,}"
Apr 16 20:11:43.284704 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.284686 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f6rn2"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.317546 2574 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.317575 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.317586 2574 server.go:85] "Starting device plugin registration server"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.317819 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.317830 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.317922 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.318008 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.318016 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.318569 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:11:43.326186 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.318608 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.390882 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.390846 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:11:43.392103 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.392086 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:11:43.392191 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.392130 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:11:43.392191 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.392153 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:11:43.392191 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.392160 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:11:43.392302 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.392197 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:11:43.395673 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.395600 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:43.418548 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.418527 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.419729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.419714 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.419806 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.419743 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.419806 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.419755 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.419806 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.419784 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.430940 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.430920 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.431001 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.430944 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-158.ec2.internal\": node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.477022 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.476989 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.492323 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.492294 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"]
Apr 16 20:11:43.492454 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.492396 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.494216 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.494201 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.494313 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.494229 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.494313 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.494240 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.495381 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.495370 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.495531 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.495518 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.495567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.495546 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.496116 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.496090 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.496195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.496122 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.496195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.496134 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.496195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.496143 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.496195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.496146 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.496195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.496153 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.497239 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.497223 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.497330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.497252 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.497840 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.497822 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.497926 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.497849 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.497926 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.497859 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.525861 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.525836 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-158.ec2.internal\" not found" node="ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.530264 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.530245 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-158.ec2.internal\" not found" node="ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.559499 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.559466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.559602 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.559502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51b4526a209496dd9377d4f989eaa37c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-158.ec2.internal\" (UID: \"51b4526a209496dd9377d4f989eaa37c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.559602 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.559521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.577988 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.577967 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.660426 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.660360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.660426 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.660398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.660426 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.660418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51b4526a209496dd9377d4f989eaa37c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-158.ec2.internal\" (UID: \"51b4526a209496dd9377d4f989eaa37c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.660564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.660460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.660564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.660493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51b4526a209496dd9377d4f989eaa37c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-158.ec2.internal\" (UID: \"51b4526a209496dd9377d4f989eaa37c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.660564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.660460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.678491 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.678461 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.779310 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.779280 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.827443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.827420 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.833093 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:43.833074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 16 20:11:43.879553 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.879514 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:43.980143 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:43.980052 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:44.080658 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:44.080629 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:44.165193 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.165157 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:44.166010 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.165321 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:44.181321 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:44.181293 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:44.255454 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.255382 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:44.278407 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.277927 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:44.281658 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:44.281634 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 16 20:11:44.284854 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:44.284829 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd77163023509e27e7eae0d866efd9e46.slice/crio-f3fa7aea410930f5c92a3ce00ff4d01fd927474b48011a74b1f4d7a5c520cd6a WatchSource:0}: Error finding container f3fa7aea410930f5c92a3ce00ff4d01fd927474b48011a74b1f4d7a5c520cd6a: Status 404 returned error can't find the container with id f3fa7aea410930f5c92a3ce00ff4d01fd927474b48011a74b1f4d7a5c520cd6a
Apr 16 20:11:44.285055 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:44.285043 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b4526a209496dd9377d4f989eaa37c.slice/crio-24e79da0fbdf72db2f6603255a84a258d76bd950ee9dd38d1d6c92228ed2deef WatchSource:0}: Error finding container 24e79da0fbdf72db2f6603255a84a258d76bd950ee9dd38d1d6c92228ed2deef: Status 404 returned error can't find the container with id 24e79da0fbdf72db2f6603255a84a258d76bd950ee9dd38d1d6c92228ed2deef
Apr 16 20:11:44.286167 ip-10-0-134-158 kubenswrapper[2574]:
I0416 20:11:44.286137 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:43 +0000 UTC" deadline="2028-01-19 11:59:09.375200184 +0000 UTC" Apr 16 20:11:44.286167 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.286164 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15423h47m25.089038399s" Apr 16 20:11:44.290130 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.290097 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:44.298170 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.298148 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bvdft" Apr 16 20:11:44.307678 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.307660 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:44.308478 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.308462 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bvdft" Apr 16 20:11:44.358260 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.358227 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" Apr 16 20:11:44.363344 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.363322 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:44.370585 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.370557 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:44.372285 
ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.372271 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" Apr 16 20:11:44.381319 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.381303 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:44.395232 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.395179 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" event={"ID":"51b4526a209496dd9377d4f989eaa37c","Type":"ContainerStarted","Data":"24e79da0fbdf72db2f6603255a84a258d76bd950ee9dd38d1d6c92228ed2deef"} Apr 16 20:11:44.396035 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.396016 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" event={"ID":"d77163023509e27e7eae0d866efd9e46","Type":"ContainerStarted","Data":"f3fa7aea410930f5c92a3ce00ff4d01fd927474b48011a74b1f4d7a5c520cd6a"} Apr 16 20:11:44.686813 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:44.686720 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:45.237749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.237712 2574 apiserver.go:52] "Watching apiserver" Apr 16 20:11:45.243920 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.243891 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:11:45.244340 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.244318 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq","openshift-image-registry/node-ca-djk7r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal","openshift-multus/multus-554gl","openshift-network-operator/iptables-alerter-qx265","openshift-ovn-kubernetes/ovnkube-node-wh5vc","kube-system/konnectivity-agent-m9rqh","openshift-cluster-node-tuning-operator/tuned-7xl9r","openshift-dns/node-resolver-s8ghz","openshift-multus/multus-additional-cni-plugins-zj8xr","openshift-multus/network-metrics-daemon-gmj69","openshift-network-diagnostics/network-check-target-nhjrd"] Apr 16 20:11:45.246209 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.246178 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.247161 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.247137 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.248668 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.248470 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gwpsm\"" Apr 16 20:11:45.248767 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.248477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.248767 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.248591 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.249220 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.249201 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-554gl" Apr 16 20:11:45.249334 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.249264 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.249405 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.249341 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8bb64\"" Apr 16 20:11:45.249457 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.249411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:11:45.249510 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.249461 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.250266 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.250244 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.251166 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.251148 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:11:45.251259 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.251153 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:11:45.251351 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.251330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.251501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.251487 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.251910 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.251887 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.252283 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.252261 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6dh22\"" Apr 16 20:11:45.252443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.252424 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.254170 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.253265 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.254170 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.253462 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.254170 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.253661 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:11:45.254170 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.253887 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vc456\"" Apr 16 20:11:45.255600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255154 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.255600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255245 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:11:45.255600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255279 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:11:45.255794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255633 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.255794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255654 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:11:45.255794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255706 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:11:45.255794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255742 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:11:45.255794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255636 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.256066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255636 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b48jt\"" Apr 16 20:11:45.256066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255817 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.256066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.255946 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.256066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.256055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6tprs\"" Apr 16 20:11:45.256283 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.256127 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.258030 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.258014 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.258459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.258440 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:11:45.258882 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.258862 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.258970 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.258923 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.259064 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.259045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:11:45.259181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.259139 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zf9lf\"" Apr 16 20:11:45.259323 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.259305 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-wrh2w\"" Apr 16 20:11:45.259584 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.259570 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.259716 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.259691 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:45.260051 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.260034 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:45.260051 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.260049 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:45.260222 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.260060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-p9nzq\"" Apr 16 20:11:45.260814 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.260784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:45.260908 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.260872 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:45.267546 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-device-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.267629 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-node-log\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.267629 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267587 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-registration-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.267795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-cni-netd\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.267795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267678 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.267795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-os-release\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.267795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267737 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95a10881-801c-4945-a254-7cb7bf980128-cni-binary-copy\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.267795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267779 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw88h\" (UniqueName: \"kubernetes.io/projected/95a10881-801c-4945-a254-7cb7bf980128-kube-api-access-pw88h\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-log-socket\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 
20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-run\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267830 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-host\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267851 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-tuned\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcxlw\" (UniqueName: \"kubernetes.io/projected/174b60ef-32a3-4bd0-a527-a01fa61b76bb-kube-api-access-gcxlw\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267919 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-socket-dir-parent\") pod 
\"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267945 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-host\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-cni-bin\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.267980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-cnibin\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-k8s-cni-cncf-io\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.268053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-etc-kubernetes\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-socket-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysconfig\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdc22\" (UniqueName: \"kubernetes.io/projected/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-kube-api-access-mdc22\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-slash\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268192 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-conf-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-sys-fs\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268261 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fbnn\" (UniqueName: \"kubernetes.io/projected/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-kube-api-access-7fbnn\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85mg\" (UniqueName: \"kubernetes.io/projected/469ea03f-422f-4251-bd51-04361b2e17fc-kube-api-access-m85mg\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: 
\"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45d8d50c-84a1-4de5-af66-9216392f6268-ovn-node-metrics-cert\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-os-release\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.268481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-ovnkube-script-lib\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268501 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09d0c13a-a560-4064-a9eb-9f9a9df65df6-konnectivity-ca\") pod \"konnectivity-agent-m9rqh\" (UID: \"09d0c13a-a560-4064-a9eb-9f9a9df65df6\") " pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-kubernetes\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysctl-conf\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-systemd\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-systemd\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 
20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-cni-multus\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268683 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95a10881-801c-4945-a254-7cb7bf980128-multus-daemon-config\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mlpx\" (UniqueName: \"kubernetes.io/projected/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-kube-api-access-2mlpx\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-hosts-file\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nhb\" (UniqueName: \"kubernetes.io/projected/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-kube-api-access-89nhb\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-cni-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268893 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-netns\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.269095 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/469ea03f-422f-4251-bd51-04361b2e17fc-iptables-alerter-script\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.268990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-systemd-units\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-var-lib-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-env-overrides\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-multus-certs\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-etc-selinux\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269122 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-run-ovn-kubernetes\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-modprobe-d\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-tmp\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269190 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-tmp-dir\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-serviceca\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269240 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysctl-d\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-sys\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-cni-bin\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-kubelet\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269396 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-ovn\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.269749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cnibin\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: 
I0416 20:11:45.269452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-etc-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-ovnkube-config\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-var-lib-kubelet\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269523 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-system-cni-dir\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-hostroot\") pod \"multus-554gl\" (UID: 
\"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/469ea03f-422f-4251-bd51-04361b2e17fc-host-slash\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-run-netns\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flxgq\" (UniqueName: \"kubernetes.io/projected/45d8d50c-84a1-4de5-af66-9216392f6268-kube-api-access-flxgq\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09d0c13a-a560-4064-a9eb-9f9a9df65df6-agent-certs\") pod \"konnectivity-agent-m9rqh\" (UID: \"09d0c13a-a560-4064-a9eb-9f9a9df65df6\") " pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-lib-modules\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269685 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-system-cni-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.270330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.269727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-kubelet\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.310182 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.310150 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:44 +0000 UTC" deadline="2027-12-01 11:51:30.681220435 +0000 UTC" Apr 16 20:11:45.310182 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.310181 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14247h39m45.371042884s" Apr 16 20:11:45.339469 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.339441 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:45.359705 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.359666 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:45.370087 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:11:45.370051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-ovnkube-config\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370250 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-var-lib-kubelet\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.370250 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-system-cni-dir\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.370250 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-system-cni-dir\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.370250 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-var-lib-kubelet\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 
20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-hostroot\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/469ea03f-422f-4251-bd51-04361b2e17fc-host-slash\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-run-netns\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flxgq\" (UniqueName: \"kubernetes.io/projected/45d8d50c-84a1-4de5-af66-9216392f6268-kube-api-access-flxgq\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-hostroot\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370492 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:11:45.370341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09d0c13a-a560-4064-a9eb-9f9a9df65df6-agent-certs\") pod \"konnectivity-agent-m9rqh\" (UID: \"09d0c13a-a560-4064-a9eb-9f9a9df65df6\") " pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-lib-modules\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-system-cni-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/469ea03f-422f-4251-bd51-04361b2e17fc-host-slash\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.370492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-run-netns\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:11:45.370504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-kubelet\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370560 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-system-cni-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-kubelet\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-device-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-node-log\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370570 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-lib-modules\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-registration-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-cni-netd\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-device-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-node-log\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:11:45.370703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-registration-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-cni-netd\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-ovnkube-config\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-os-release\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.370947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-os-release\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95a10881-801c-4945-a254-7cb7bf980128-cni-binary-copy\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw88h\" (UniqueName: \"kubernetes.io/projected/95a10881-801c-4945-a254-7cb7bf980128-kube-api-access-pw88h\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370845 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-log-socket\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-run\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-host\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-log-socket\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-host\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-run\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.370996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-tuned\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcxlw\" (UniqueName: \"kubernetes.io/projected/174b60ef-32a3-4bd0-a527-a01fa61b76bb-kube-api-access-gcxlw\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-socket-dir-parent\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-socket-dir-parent\") pod \"multus-554gl\" (UID: 
\"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmh6q\" (UniqueName: \"kubernetes.io/projected/604b143f-56b9-4ff2-a025-f1f904de0066-kube-api-access-pmh6q\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-host\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371211 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-cni-bin\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-cnibin\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.371729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-k8s-cni-cncf-io\") pod \"multus-554gl\" (UID: 
\"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371284 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-etc-kubernetes\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-socket-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-cni-bin\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysconfig\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdc22\" (UniqueName: \"kubernetes.io/projected/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-kube-api-access-mdc22\") pod \"tuned-7xl9r\" (UID: 
\"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-host\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-cnibin\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-slash\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371423 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-slash\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-conf-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 
20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-conf-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-sys-fs\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-socket-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysconfig\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fbnn\" (UniqueName: \"kubernetes.io/projected/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-kube-api-access-7fbnn\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m85mg\" (UniqueName: \"kubernetes.io/projected/469ea03f-422f-4251-bd51-04361b2e17fc-kube-api-access-m85mg\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-etc-kubernetes\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.372564 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95a10881-801c-4945-a254-7cb7bf980128-cni-binary-copy\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45d8d50c-84a1-4de5-af66-9216392f6268-ovn-node-metrics-cert\") pod \"ovnkube-node-wh5vc\" (UID: 
\"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-k8s-cni-cncf-io\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-sys-fs\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-os-release\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-ovnkube-script-lib\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-os-release\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09d0c13a-a560-4064-a9eb-9f9a9df65df6-konnectivity-ca\") pod \"konnectivity-agent-m9rqh\" (UID: \"09d0c13a-a560-4064-a9eb-9f9a9df65df6\") " pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371887 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-kubernetes\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysctl-conf\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-systemd\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-kubernetes\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-systemd\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.373459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.371986 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-cni-multus\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95a10881-801c-4945-a254-7cb7bf980128-multus-daemon-config\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysctl-conf\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372070 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mlpx\" (UniqueName: \"kubernetes.io/projected/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-kube-api-access-2mlpx\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: 
I0416 20:11:45.372131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-hosts-file\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89nhb\" (UniqueName: \"kubernetes.io/projected/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-kube-api-access-89nhb\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-cni-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 
20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-netns\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/469ea03f-422f-4251-bd51-04361b2e17fc-iptables-alerter-script\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-systemd-units\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372338 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-ovnkube-script-lib\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-systemd\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" 
Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-var-lib-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-var-lib-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.374067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-env-overrides\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-systemd\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-multus-certs\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " 
pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09d0c13a-a560-4064-a9eb-9f9a9df65df6-konnectivity-ca\") pod \"konnectivity-agent-m9rqh\" (UID: \"09d0c13a-a560-4064-a9eb-9f9a9df65df6\") " pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372443 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-cni-multus\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-etc-selinux\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-multus-cni-dir\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-etc-selinux\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372590 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-netns\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-hosts-file\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-run-ovn-kubernetes\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-run-multus-certs\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-modprobe-d\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.374854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-run-ovn-kubernetes\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-tmp\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-tmp-dir\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-serviceca\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysctl-d\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-sys\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-cni-bin\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-kubelet\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-ovn\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/95a10881-801c-4945-a254-7cb7bf980128-multus-daemon-config\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cnibin\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.372957 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-sysctl-d\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-etc-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-tmp-dir\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-run-ovn\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373240 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-etc-openvswitch\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.375620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cnibin\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95a10881-801c-4945-a254-7cb7bf980128-host-var-lib-cni-bin\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-systemd-units\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45d8d50c-84a1-4de5-af66-9216392f6268-host-kubelet\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-sys\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45d8d50c-84a1-4de5-af66-9216392f6268-env-overrides\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/174b60ef-32a3-4bd0-a527-a01fa61b76bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373443 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-modprobe-d\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-serviceca\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.373908 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/469ea03f-422f-4251-bd51-04361b2e17fc-iptables-alerter-script\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.374453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09d0c13a-a560-4064-a9eb-9f9a9df65df6-agent-certs\") pod \"konnectivity-agent-m9rqh\" (UID: \"09d0c13a-a560-4064-a9eb-9f9a9df65df6\") " pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.374763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-etc-tuned\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.375005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/45d8d50c-84a1-4de5-af66-9216392f6268-ovn-node-metrics-cert\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.376402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.375544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-tmp\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.378714 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.378691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw88h\" (UniqueName: \"kubernetes.io/projected/95a10881-801c-4945-a254-7cb7bf980128-kube-api-access-pw88h\") pod \"multus-554gl\" (UID: \"95a10881-801c-4945-a254-7cb7bf980128\") " pod="openshift-multus/multus-554gl" Apr 16 20:11:45.379459 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.379427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdc22\" (UniqueName: \"kubernetes.io/projected/bd2529ae-05ec-4e4d-be54-a85e19f1b7b7-kube-api-access-mdc22\") pod \"tuned-7xl9r\" (UID: \"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7\") " pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.381135 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.380856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcxlw\" (UniqueName: \"kubernetes.io/projected/174b60ef-32a3-4bd0-a527-a01fa61b76bb-kube-api-access-gcxlw\") pod \"multus-additional-cni-plugins-zj8xr\" (UID: \"174b60ef-32a3-4bd0-a527-a01fa61b76bb\") " pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.381492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.381267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fbnn\" 
(UniqueName: \"kubernetes.io/projected/f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e-kube-api-access-7fbnn\") pod \"aws-ebs-csi-driver-node-wn6gq\" (UID: \"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.381492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.381453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxgq\" (UniqueName: \"kubernetes.io/projected/45d8d50c-84a1-4de5-af66-9216392f6268-kube-api-access-flxgq\") pod \"ovnkube-node-wh5vc\" (UID: \"45d8d50c-84a1-4de5-af66-9216392f6268\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.382266 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.382245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mlpx\" (UniqueName: \"kubernetes.io/projected/5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6-kube-api-access-2mlpx\") pod \"node-ca-djk7r\" (UID: \"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6\") " pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.382342 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.382268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nhb\" (UniqueName: \"kubernetes.io/projected/f5a78283-8ec7-49a3-9423-1ae8f58f10ec-kube-api-access-89nhb\") pod \"node-resolver-s8ghz\" (UID: \"f5a78283-8ec7-49a3-9423-1ae8f58f10ec\") " pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.382561 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.382545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85mg\" (UniqueName: \"kubernetes.io/projected/469ea03f-422f-4251-bd51-04361b2e17fc-kube-api-access-m85mg\") pod \"iptables-alerter-qx265\" (UID: \"469ea03f-422f-4251-bd51-04361b2e17fc\") " pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.474260 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.474218 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.474260 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.474260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:45.474509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.474311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmh6q\" (UniqueName: \"kubernetes.io/projected/604b143f-56b9-4ff2-a025-f1f904de0066-kube-api-access-pmh6q\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.474509 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.474398 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.474509 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.474473 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.974450981 +0000 UTC m=+3.149520127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.484085 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.484053 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:45.484085 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.484084 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:45.484340 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.484125 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:45.484340 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.484205 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.984185186 +0000 UTC m=+3.159254350 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:45.486734 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.486704 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmh6q\" (UniqueName: \"kubernetes.io/projected/604b143f-56b9-4ff2-a025-f1f904de0066-kube-api-access-pmh6q\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.559578 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.559436 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s8ghz" Apr 16 20:11:45.567412 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.567387 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" Apr 16 20:11:45.575053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.575026 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-554gl" Apr 16 20:11:45.580696 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.580673 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-djk7r" Apr 16 20:11:45.588276 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.588250 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qx265" Apr 16 20:11:45.594921 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.594900 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:11:45.601586 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.601564 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:11:45.608248 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.608220 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" Apr 16 20:11:45.613819 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.613795 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" Apr 16 20:11:45.936577 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:45.936550 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod469ea03f_422f_4251_bd51_04361b2e17fc.slice/crio-4aea231ff0ef760a698f2d408db52b364f1414c5b29d52464552480bcea65ccc WatchSource:0}: Error finding container 4aea231ff0ef760a698f2d408db52b364f1414c5b29d52464552480bcea65ccc: Status 404 returned error can't find the container with id 4aea231ff0ef760a698f2d408db52b364f1414c5b29d52464552480bcea65ccc Apr 16 20:11:45.938265 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:45.938154 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a10881_801c_4945_a254_7cb7bf980128.slice/crio-ac15f85f97e2f6ad7ac393ec4ad202f0a47cdf2667436bbf81a5b71d22d763e1 WatchSource:0}: Error finding container ac15f85f97e2f6ad7ac393ec4ad202f0a47cdf2667436bbf81a5b71d22d763e1: Status 404 returned error can't find the container with id ac15f85f97e2f6ad7ac393ec4ad202f0a47cdf2667436bbf81a5b71d22d763e1 Apr 16 20:11:45.938980 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:45.938923 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f1ca9fc_d31a_4c74_aa4f_c265b81f18f6.slice/crio-338b9d6604b1af50fa5d3b1f50bb4c949ae9385b80af6074e0a872b0be549305 WatchSource:0}: Error finding container 338b9d6604b1af50fa5d3b1f50bb4c949ae9385b80af6074e0a872b0be549305: Status 404 returned error can't find the container with id 338b9d6604b1af50fa5d3b1f50bb4c949ae9385b80af6074e0a872b0be549305 Apr 16 20:11:45.939682 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:45.939649 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f80c5c_dfb0_4e5f_b4cf_a467f3e4c37e.slice/crio-6bf5ff004bdff00a14df35dfd4b6061f7c9857bbc333d1b7b25613c8fd60fb13 WatchSource:0}: Error finding container 6bf5ff004bdff00a14df35dfd4b6061f7c9857bbc333d1b7b25613c8fd60fb13: Status 404 returned error can't find the container with id 6bf5ff004bdff00a14df35dfd4b6061f7c9857bbc333d1b7b25613c8fd60fb13 Apr 16 20:11:45.942126 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:45.941948 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d0c13a_a560_4064_a9eb_9f9a9df65df6.slice/crio-54d0fd3c96c973452a89f78f975b51baf60cf16f686e5ed009948bf9ec61c460 WatchSource:0}: Error finding container 54d0fd3c96c973452a89f78f975b51baf60cf16f686e5ed009948bf9ec61c460: Status 404 returned error can't find the container with id 54d0fd3c96c973452a89f78f975b51baf60cf16f686e5ed009948bf9ec61c460 Apr 16 20:11:45.943299 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:11:45.943264 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d8d50c_84a1_4de5_af66_9216392f6268.slice/crio-e265ca79ce7c271acadb58b100dfe8e9e488110849aad59a8c237ab484bffe9b WatchSource:0}: Error finding container e265ca79ce7c271acadb58b100dfe8e9e488110849aad59a8c237ab484bffe9b: Status 404 returned error can't find 
the container with id e265ca79ce7c271acadb58b100dfe8e9e488110849aad59a8c237ab484bffe9b Apr 16 20:11:45.977533 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:45.977500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:45.977670 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.977641 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.977722 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:45.977706 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:46.97768865 +0000 UTC m=+4.152757796 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:46.077982 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.077947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:46.078193 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.078099 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:46.078193 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.078129 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:46.078193 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.078138 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:46.078193 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.078189 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:47.078175274 +0000 UTC m=+4.253244423 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:46.310845 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.310719 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:44 +0000 UTC" deadline="2027-11-13 15:56:46.556260346 +0000 UTC" Apr 16 20:11:46.310845 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.310758 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13819h45m0.245506605s" Apr 16 20:11:46.393099 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.393065 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:46.393285 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.393206 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:46.405075 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.404420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" event={"ID":"51b4526a209496dd9377d4f989eaa37c","Type":"ContainerStarted","Data":"925aac5c544d4ba033229b853d5df6e8ab9f3b9428b6040f55c8a0e3285e6257"} Apr 16 20:11:46.406832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.406787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8ghz" event={"ID":"f5a78283-8ec7-49a3-9423-1ae8f58f10ec","Type":"ContainerStarted","Data":"5a2b7e49817a437d3aee3638d21c8b993ecc4f40f6baa777b554c12193785f81"} Apr 16 20:11:46.410356 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.410295 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerStarted","Data":"7158fc9adeb35f41e8db8bc0b9799c7ddc8e12a4dc2768352063fef13f529280"} Apr 16 20:11:46.414887 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.414837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"e265ca79ce7c271acadb58b100dfe8e9e488110849aad59a8c237ab484bffe9b"} Apr 16 20:11:46.416538 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.416515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" event={"ID":"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7","Type":"ContainerStarted","Data":"64b2a9f749a7ee3c0037d3001413529001571e67acdbf76285ae821bb3202f94"} Apr 16 20:11:46.418129 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.418085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-m9rqh" event={"ID":"09d0c13a-a560-4064-a9eb-9f9a9df65df6","Type":"ContainerStarted","Data":"54d0fd3c96c973452a89f78f975b51baf60cf16f686e5ed009948bf9ec61c460"} Apr 16 20:11:46.422882 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.422823 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" podStartSLOduration=2.422809205 podStartE2EDuration="2.422809205s" podCreationTimestamp="2026-04-16 20:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:46.421724244 +0000 UTC m=+3.596793413" watchObservedRunningTime="2026-04-16 20:11:46.422809205 +0000 UTC m=+3.597878375" Apr 16 20:11:46.425765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.424521 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" event={"ID":"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e","Type":"ContainerStarted","Data":"6bf5ff004bdff00a14df35dfd4b6061f7c9857bbc333d1b7b25613c8fd60fb13"} Apr 16 20:11:46.429064 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.429039 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-djk7r" event={"ID":"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6","Type":"ContainerStarted","Data":"338b9d6604b1af50fa5d3b1f50bb4c949ae9385b80af6074e0a872b0be549305"} Apr 16 20:11:46.430902 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.430878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-554gl" event={"ID":"95a10881-801c-4945-a254-7cb7bf980128","Type":"ContainerStarted","Data":"ac15f85f97e2f6ad7ac393ec4ad202f0a47cdf2667436bbf81a5b71d22d763e1"} Apr 16 20:11:46.433935 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.433911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-qx265" event={"ID":"469ea03f-422f-4251-bd51-04361b2e17fc","Type":"ContainerStarted","Data":"4aea231ff0ef760a698f2d408db52b364f1414c5b29d52464552480bcea65ccc"} Apr 16 20:11:46.990845 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:46.990254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:46.990845 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.990416 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:46.990845 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:46.990496 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:48.99047757 +0000 UTC m=+6.165546723 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:47.090853 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:47.090810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:47.091055 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:47.091040 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:47.091141 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:47.091062 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:47.091141 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:47.091075 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:47.091249 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:47.091154 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:49.09113415 +0000 UTC m=+6.266203304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:47.393357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:47.393318 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:47.396599 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:47.396549 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:47.451880 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:47.451830 2574 generic.go:358] "Generic (PLEG): container finished" podID="d77163023509e27e7eae0d866efd9e46" containerID="4668272c21cd1a89d2e645ddc47f72bb9e99e303b2e7eadc653d4de01353888b" exitCode=0 Apr 16 20:11:47.452898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:47.452866 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" event={"ID":"d77163023509e27e7eae0d866efd9e46","Type":"ContainerDied","Data":"4668272c21cd1a89d2e645ddc47f72bb9e99e303b2e7eadc653d4de01353888b"} Apr 16 20:11:48.393175 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:48.392645 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:48.393175 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:48.392784 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:48.462158 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:48.461826 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" event={"ID":"d77163023509e27e7eae0d866efd9e46","Type":"ContainerStarted","Data":"6e913416dca95ffaf98d7a163c9eef56990e239cd34fa300aef599c43c4634e8"} Apr 16 20:11:49.011683 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:49.011640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:49.011876 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.011844 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:49.011919 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.011907 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.011890038 +0000 UTC m=+10.186959188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:49.112623 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:49.112584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:49.112828 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.112787 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:49.112828 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.112814 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:49.112828 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.112828 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:49.112997 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.112887 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:53.112867546 +0000 UTC m=+10.287936707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:49.395966 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:49.395422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:49.395966 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:49.395565 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:50.393243 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:50.393206 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:50.393688 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:50.393360 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:51.392857 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:51.392820 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:51.393019 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:51.392974 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:52.392486 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:52.392389 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:52.392961 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:52.392537 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:53.044152 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.044103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:53.044327 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.044275 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:53.044327 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.044328 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:01.044309012 +0000 UTC m=+18.219378159 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:53.145325 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.145289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:53.145521 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.145499 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:53.145587 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.145522 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:53.145587 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.145536 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:53.145684 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.145604 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:01.145583769 +0000 UTC m=+18.320652917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:53.393568 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.393526 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:53.394005 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.393653 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:53.445764 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.445695 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" podStartSLOduration=9.445674914 podStartE2EDuration="9.445674914s" podCreationTimestamp="2026-04-16 20:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:48.480398728 +0000 UTC m=+5.655467896" watchObservedRunningTime="2026-04-16 20:11:53.445674914 +0000 UTC m=+10.620744082" Apr 16 20:11:53.446671 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.445975 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tjktw"] Apr 16 20:11:53.451164 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.451091 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.451278 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.451196 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:11:53.549643 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.549432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9a8f93c5-d267-47b5-a685-fc5bd8269d88-dbus\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.549643 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.549480 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.549643 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.549551 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9a8f93c5-d267-47b5-a685-fc5bd8269d88-kubelet-config\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.650825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.650747 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9a8f93c5-d267-47b5-a685-fc5bd8269d88-kubelet-config\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.650985 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.650829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/9a8f93c5-d267-47b5-a685-fc5bd8269d88-dbus\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.650985 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.650857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.650985 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.650862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9a8f93c5-d267-47b5-a685-fc5bd8269d88-kubelet-config\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:53.650985 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.650953 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:53.651228 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:53.651005 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.15098695 +0000 UTC m=+11.326056104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:53.651228 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:53.651013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9a8f93c5-d267-47b5-a685-fc5bd8269d88-dbus\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:54.155455 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:54.155368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:54.155635 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:54.155491 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:54.155635 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:54.155570 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:55.155551002 +0000 UTC m=+12.330620167 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:54.392397 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:54.392356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:54.392585 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:54.392482 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:55.167335 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:55.167298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:55.167762 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:55.167434 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:55.167762 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:55.167492 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:57.167479462 +0000 UTC m=+14.342548613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:55.392593 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:55.392506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:55.392744 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:55.392506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:55.392744 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:55.392643 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:55.392744 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:55.392722 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:11:56.393136 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:56.393095 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:56.393528 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:56.393227 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:57.184246 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:57.184209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:57.184422 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:57.184335 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:57.184422 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:57.184399 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:01.184385209 +0000 UTC m=+18.359454356 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:57.393386 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:57.393347 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:57.393745 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:57.393496 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:57.393745 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:57.393353 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:57.393745 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:57.393601 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:11:58.392589 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:58.392544 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:11:58.392781 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:58.392685 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:11:59.392649 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:59.392607 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:11:59.393092 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:11:59.392663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:11:59.393092 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:59.392752 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:11:59.393092 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:11:59.392879 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:12:00.392783 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:00.392739 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:00.393272 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:00.392873 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:12:01.113665 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:01.113624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:01.113928 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.113794 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:01.113928 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.113865 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:17.113844742 +0000 UTC m=+34.288913888 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:01.214334 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:01.214289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:01.214499 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:01.214367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:01.214499 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.214478 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:01.214611 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.214528 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:09.214515641 +0000 UTC m=+26.389584790 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:01.214611 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.214479 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:01.214611 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.214580 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:01.214611 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.214594 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:01.214783 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.214643 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:17.21463007 +0000 UTC m=+34.389699240 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:01.393509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:01.393421 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:01.393954 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:01.393421 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:01.393954 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.393577 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:12:01.393954 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:01.393624 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:12:02.392847 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:02.392818 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:02.393032 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:02.392941 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:12:03.396600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.395838 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:03.396600 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:03.396213 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:12:03.396600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.396335 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:03.396600 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:03.396429 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:12:03.509029 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.507310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8ghz" event={"ID":"f5a78283-8ec7-49a3-9423-1ae8f58f10ec","Type":"ContainerStarted","Data":"6157a924c895e8de7bcd6fbf311760f5a6df83b17a720d37a50677caba42d67e"} Apr 16 20:12:03.514842 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.514743 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" event={"ID":"bd2529ae-05ec-4e4d-be54-a85e19f1b7b7","Type":"ContainerStarted","Data":"3d78e9bc54c745f7c36db9a5b05351cde8cf77912ae5fd212fdef6e1b4081510"} Apr 16 20:12:03.517620 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.517325 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m9rqh" event={"ID":"09d0c13a-a560-4064-a9eb-9f9a9df65df6","Type":"ContainerStarted","Data":"5900339a42bf4058f386431886dc51f39536cce4101342804ae97d49f77cb325"} Apr 16 20:12:03.529837 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.529736 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s8ghz" podStartSLOduration=3.28034993 podStartE2EDuration="20.5297184s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.948706502 +0000 UTC m=+3.123775650" lastFinishedPulling="2026-04-16 20:12:03.19807496 +0000 UTC m=+20.373144120" observedRunningTime="2026-04-16 20:12:03.529477523 +0000 UTC m=+20.704546691" watchObservedRunningTime="2026-04-16 20:12:03.5297184 +0000 UTC m=+20.704787569" Apr 16 20:12:03.575169 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.574867 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-m9rqh" podStartSLOduration=7.054972774 podStartE2EDuration="20.574851184s" 
podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.943952639 +0000 UTC m=+3.119021798" lastFinishedPulling="2026-04-16 20:11:59.463831052 +0000 UTC m=+16.638900208" observedRunningTime="2026-04-16 20:12:03.551050958 +0000 UTC m=+20.726120127" watchObservedRunningTime="2026-04-16 20:12:03.574851184 +0000 UTC m=+20.749920353" Apr 16 20:12:03.575303 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:03.575213 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7xl9r" podStartSLOduration=3.241996377 podStartE2EDuration="20.575199713s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.948410311 +0000 UTC m=+3.123479456" lastFinishedPulling="2026-04-16 20:12:03.281613632 +0000 UTC m=+20.456682792" observedRunningTime="2026-04-16 20:12:03.57414469 +0000 UTC m=+20.749213862" watchObservedRunningTime="2026-04-16 20:12:03.575199713 +0000 UTC m=+20.750268882" Apr 16 20:12:04.351253 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.351068 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:12:04.392392 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.392343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:04.392546 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:04.392481 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:12:04.521077 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.521034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" event={"ID":"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e","Type":"ContainerStarted","Data":"b571ccfaa8456e9a0531fe41e1a0990f7283a500319e04962ee80d8da49c3b8c"} Apr 16 20:12:04.521077 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.521074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" event={"ID":"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e","Type":"ContainerStarted","Data":"388577c5a684c8ffff1edce944f3c2c5eeabac7473b37d21744d0b116cb0d41a"} Apr 16 20:12:04.522465 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.522427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-djk7r" event={"ID":"5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6","Type":"ContainerStarted","Data":"326fab7034b55d277e8e17ba6e62b32e2833c1bf759c65e28602c31c2aea2da0"} Apr 16 20:12:04.523816 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.523793 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-554gl" event={"ID":"95a10881-801c-4945-a254-7cb7bf980128","Type":"ContainerStarted","Data":"309b88d703aa3c48b85205bfb9a5be86c4ecbe4ad22a0103a533d2ea7d518add"} Apr 16 20:12:04.525033 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.525010 2574 generic.go:358] "Generic (PLEG): container finished" podID="174b60ef-32a3-4bd0-a527-a01fa61b76bb" containerID="e87dbce0a8ba05d2c943277a0f7618bfc3507b79cec2bed2e4094921caef926e" exitCode=0 Apr 16 20:12:04.525168 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.525089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" 
event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerDied","Data":"e87dbce0a8ba05d2c943277a0f7618bfc3507b79cec2bed2e4094921caef926e"} Apr 16 20:12:04.527760 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.527735 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"6662c74ae9c01591b2d153c97ebebabda4bca57ce010c459ace3b2b655240f0d"} Apr 16 20:12:04.527831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.527769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"42d6e38f2ea171f97099d9b75939204edcb6ecdc97d7618a67cfbbac90d90a8c"} Apr 16 20:12:04.527831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.527780 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"9cc912e021db1e6e4784c44dbb7fe369be15cda85af61a8075cab78a0d30bd62"} Apr 16 20:12:04.527831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.527788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"fefc7166822a8baa18739d81b935c3c630d65873d0dfddb41c91c1ca007fb664"} Apr 16 20:12:04.527831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.527796 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"b66f92fc2eb152d741da0293f191b3d9a5f3769879d0ae25eefebe10251beb06"} Apr 16 20:12:04.527831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.527806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"f64147f2dd7c3083ccb5f90d5e0f1957fa33cbedbb2ee6c2ae51123b57cd404f"} Apr 16 20:12:04.541866 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.541824 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-djk7r" podStartSLOduration=4.284692859 podStartE2EDuration="21.541814098s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.940961413 +0000 UTC m=+3.116030564" lastFinishedPulling="2026-04-16 20:12:03.198082653 +0000 UTC m=+20.373151803" observedRunningTime="2026-04-16 20:12:04.541705073 +0000 UTC m=+21.716774252" watchObservedRunningTime="2026-04-16 20:12:04.541814098 +0000 UTC m=+21.716883269" Apr 16 20:12:04.567631 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:04.567588 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-554gl" podStartSLOduration=4.187252571 podStartE2EDuration="21.567578346s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.94026193 +0000 UTC m=+3.115331090" lastFinishedPulling="2026-04-16 20:12:03.320587702 +0000 UTC m=+20.495656865" observedRunningTime="2026-04-16 20:12:04.567396162 +0000 UTC m=+21.742465330" watchObservedRunningTime="2026-04-16 20:12:04.567578346 +0000 UTC m=+21.742647515" Apr 16 20:12:05.331507 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.331393 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:04.351249746Z","UUID":"e41867ec-4239-4bfe-b42f-c6d0e204b9ff","Handler":null,"Name":"","Endpoint":""} Apr 16 20:12:05.334739 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.334714 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: 
ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:12:05.334739 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.334747 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:12:05.393376 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.393342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:05.393555 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.393383 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:05.393555 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:05.393498 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:12:05.393667 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:05.393639 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:12:05.533658 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.533617 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" event={"ID":"f9f80c5c-dfb0-4e5f-b4cf-a467f3e4c37e","Type":"ContainerStarted","Data":"ae54efbfc45ecc53a08e3050e3e24c1d5a2ce4d018359144ff2095617a6f7f1a"} Apr 16 20:12:05.535356 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.535311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qx265" event={"ID":"469ea03f-422f-4251-bd51-04361b2e17fc","Type":"ContainerStarted","Data":"9007c8a8da9d1b82f1178ffbf4e6a76cd155072c69d60d995ecf6b0744135723"} Apr 16 20:12:05.550736 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.550672 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wn6gq" podStartSLOduration=3.320021528 podStartE2EDuration="22.550653772s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.941724817 +0000 UTC m=+3.116793963" lastFinishedPulling="2026-04-16 20:12:05.172357054 +0000 UTC m=+22.347426207" observedRunningTime="2026-04-16 20:12:05.550194771 +0000 UTC m=+22.725263940" watchObservedRunningTime="2026-04-16 20:12:05.550653772 +0000 UTC m=+22.725722924" Apr 16 20:12:05.563996 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.563941 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qx265" podStartSLOduration=5.220106254 podStartE2EDuration="22.563925847s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.938471163 +0000 UTC m=+3.113540323" lastFinishedPulling="2026-04-16 20:12:03.282290764 +0000 UTC m=+20.457359916" observedRunningTime="2026-04-16 20:12:05.56361434 +0000 
UTC m=+22.738683525" watchObservedRunningTime="2026-04-16 20:12:05.563925847 +0000 UTC m=+22.738995094" Apr 16 20:12:05.773586 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:05.773554 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s8ghz_f5a78283-8ec7-49a3-9423-1ae8f58f10ec/dns-node-resolver/0.log" Apr 16 20:12:06.392987 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:06.392899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:06.393176 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:06.393006 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:12:06.540386 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:06.540352 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"ac302556202176b22c436655406142146c94a3e1f60640cdc745d7f186a6ff64"} Apr 16 20:12:06.561474 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:06.561447 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-djk7r_5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6/node-ca/0.log" Apr 16 20:12:07.001398 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:07.001354 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:12:07.002041 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:07.002021 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:12:07.392501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:07.392419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:07.392501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:07.392419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:07.392719 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:07.392534 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:12:07.392719 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:07.392615 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066" Apr 16 20:12:07.541996 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:07.541964 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:12:07.542589 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:07.542545 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-m9rqh" Apr 16 20:12:08.392705 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.392569 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:08.392803 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:08.392785 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60" Apr 16 20:12:08.549587 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.549541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerStarted","Data":"69ec3ee31a3b80fc6a33efa289d51b38813d3e69aee21827b241c2dfc7bf20b3"} Apr 16 20:12:08.553080 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.553052 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" event={"ID":"45d8d50c-84a1-4de5-af66-9216392f6268","Type":"ContainerStarted","Data":"8b9392ca7402fe230664b697f5c4dc158713c9aa87de2509b9d5dde1f794ba55"} Apr 16 20:12:08.553532 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.553493 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:12:08.553532 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.553524 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:12:08.569315 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.569291 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" Apr 16 20:12:08.691879 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:08.691787 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc" podStartSLOduration=8.130317078 podStartE2EDuration="25.691772994s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.945759419 +0000 UTC m=+3.120828572" lastFinishedPulling="2026-04-16 20:12:03.507215324 +0000 UTC m=+20.682284488" observedRunningTime="2026-04-16 20:12:08.691720642 +0000 UTC m=+25.866789808" watchObservedRunningTime="2026-04-16 20:12:08.691772994 +0000 UTC m=+25.866842161" Apr 16 20:12:09.278982 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.278943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:09.279178 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:09.279066 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:09.279178 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:09.279142 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:25.279127404 +0000 UTC m=+42.454196551 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:09.392912 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.392873 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:09.393086 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:09.392981 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88" Apr 16 20:12:09.393086 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.393071 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:09.393223 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:09.393198 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:09.556563 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.556479 2574 generic.go:358] "Generic (PLEG): container finished" podID="174b60ef-32a3-4bd0-a527-a01fa61b76bb" containerID="69ec3ee31a3b80fc6a33efa289d51b38813d3e69aee21827b241c2dfc7bf20b3" exitCode=0
Apr 16 20:12:09.556968 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.556557 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerDied","Data":"69ec3ee31a3b80fc6a33efa289d51b38813d3e69aee21827b241c2dfc7bf20b3"}
Apr 16 20:12:09.558035 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.557253 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc"
Apr 16 20:12:09.571708 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:09.571685 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc"
Apr 16 20:12:10.393313 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:10.393270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:10.393462 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:10.393383 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:10.560326 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:10.560293 2574 generic.go:358] "Generic (PLEG): container finished" podID="174b60ef-32a3-4bd0-a527-a01fa61b76bb" containerID="913e1726466782b79efe715b7188be4d2fa85fbe8b548c26ebd9bdefae29baef" exitCode=0
Apr 16 20:12:10.560688 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:10.560382 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerDied","Data":"913e1726466782b79efe715b7188be4d2fa85fbe8b548c26ebd9bdefae29baef"}
Apr 16 20:12:11.393421 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:11.393380 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:11.393625 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:11.393380 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:11.393625 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:11.393509 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:11.393625 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:11.393587 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:11.564277 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:11.564235 2574 generic.go:358] "Generic (PLEG): container finished" podID="174b60ef-32a3-4bd0-a527-a01fa61b76bb" containerID="46b4b6fcbeb29eb85cf9f9952ea0caac79f3036dac73da12b7b9332f0bf177e2" exitCode=0
Apr 16 20:12:11.564701 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:11.564360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerDied","Data":"46b4b6fcbeb29eb85cf9f9952ea0caac79f3036dac73da12b7b9332f0bf177e2"}
Apr 16 20:12:12.393007 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:12.392970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:12.393204 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:12.393120 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:13.393750 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:13.393711 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:13.394218 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:13.393821 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:13.394218 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:13.393914 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:13.394218 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:13.394035 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:14.393186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:14.393159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:14.393367 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:14.393268 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:15.392440 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:15.392403 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:15.392917 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:15.392417 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:15.392917 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:15.392559 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:15.392917 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:15.392581 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:16.392903 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:16.392854 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:16.393498 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:16.392976 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:17.145255 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:17.145210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:17.145450 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.145356 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:17.145450 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.145422 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs podName:604b143f-56b9-4ff2-a025-f1f904de0066 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:49.145405456 +0000 UTC m=+66.320474602 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs") pod "network-metrics-daemon-gmj69" (UID: "604b143f-56b9-4ff2-a025-f1f904de0066") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:17.246189 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:17.246147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:17.246366 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.246316 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:17.246366 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.246349 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:17.246366 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.246359 2574 projected.go:194] Error preparing data for projected volume kube-api-access-s5zf4 for pod openshift-network-diagnostics/network-check-target-nhjrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:17.246464 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.246413 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4 podName:3251e838-9ac0-43bc-88bb-3f2002d4ad60 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:49.246398407 +0000 UTC m=+66.421467555 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s5zf4" (UniqueName: "kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4") pod "network-check-target-nhjrd" (UID: "3251e838-9ac0-43bc-88bb-3f2002d4ad60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:17.393247 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:17.393213 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:17.393604 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:17.393225 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:17.393604 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.393349 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:17.393604 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:17.393460 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:18.392926 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:18.392891 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:18.393099 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:18.393006 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:18.579566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:18.579529 2574 generic.go:358] "Generic (PLEG): container finished" podID="174b60ef-32a3-4bd0-a527-a01fa61b76bb" containerID="5b113ce75b196b6933173f2fe6c6671e4ef2b25536d6f8de6f6c925ad2389d42" exitCode=0
Apr 16 20:12:18.579910 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:18.579592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerDied","Data":"5b113ce75b196b6933173f2fe6c6671e4ef2b25536d6f8de6f6c925ad2389d42"}
Apr 16 20:12:19.393389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:19.393350 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:19.393599 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:19.393350 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:19.393599 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:19.393478 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:19.393599 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:19.393545 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:19.584345 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:19.584314 2574 generic.go:358] "Generic (PLEG): container finished" podID="174b60ef-32a3-4bd0-a527-a01fa61b76bb" containerID="6b4d1c1eef88a804cc325e8b25fc737af2db6c4fe02980f81bc55c064bb3414d" exitCode=0
Apr 16 20:12:19.584747 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:19.584360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerDied","Data":"6b4d1c1eef88a804cc325e8b25fc737af2db6c4fe02980f81bc55c064bb3414d"}
Apr 16 20:12:20.392727 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:20.392692 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:20.392899 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:20.392806 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:20.589062 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:20.589027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" event={"ID":"174b60ef-32a3-4bd0-a527-a01fa61b76bb","Type":"ContainerStarted","Data":"24a28e320c94290d343ac526422e0799ecd36d3d0c2acb960e944b92f6d6864b"}
Apr 16 20:12:20.612648 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:20.612603 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zj8xr" podStartSLOduration=5.954581832 podStartE2EDuration="37.612587986s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.946619497 +0000 UTC m=+3.121688654" lastFinishedPulling="2026-04-16 20:12:17.604625657 +0000 UTC m=+34.779694808" observedRunningTime="2026-04-16 20:12:20.611517529 +0000 UTC m=+37.786586697" watchObservedRunningTime="2026-04-16 20:12:20.612587986 +0000 UTC m=+37.787657169"
Apr 16 20:12:21.393267 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:21.393235 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:21.393415 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:21.393345 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:21.393481 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:21.393431 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:21.393556 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:21.393534 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:22.392785 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:22.392752 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:22.393179 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:22.392853 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:23.393783 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:23.393748 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:23.394233 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:23.393834 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:23.394233 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:23.393874 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:23.394233 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:23.393924 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:24.392502 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:24.392470 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:24.392700 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:24.392587 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:25.303587 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:25.303547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:25.303957 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:25.303668 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:25.303957 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:25.303717 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret podName:9a8f93c5-d267-47b5-a685-fc5bd8269d88 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:57.303705755 +0000 UTC m=+74.478774902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret") pod "global-pull-secret-syncer-tjktw" (UID: "9a8f93c5-d267-47b5-a685-fc5bd8269d88") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:25.392649 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:25.392611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:25.392824 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:25.392611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:25.392824 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:25.392727 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:25.392824 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:25.392788 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:26.393212 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:26.393182 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:26.393650 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:26.393301 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:27.393185 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:27.393149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:27.393361 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:27.393162 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:27.393361 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:27.393248 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:27.393666 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:27.393353 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:28.345704 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:28.345670 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tjktw"]
Apr 16 20:12:28.345879 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:28.345859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:28.345995 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:28.345974 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:28.348420 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:28.348393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nhjrd"]
Apr 16 20:12:28.348691 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:28.348672 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:28.348905 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:28.348884 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:28.349168 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:28.349146 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gmj69"]
Apr 16 20:12:28.349371 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:28.349351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:28.349482 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:28.349461 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:30.393091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:30.392905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:30.393747 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:30.392948 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:30.393747 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:30.393196 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:30.393747 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:30.392974 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:30.393747 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:30.393242 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:30.393747 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:30.393299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:32.392810 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:32.392767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:32.392810 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:32.392793 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:32.393505 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:32.392872 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nhjrd" podUID="3251e838-9ac0-43bc-88bb-3f2002d4ad60"
Apr 16 20:12:32.393505 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:32.392906 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69"
Apr 16 20:12:32.393505 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:32.392990 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmj69" podUID="604b143f-56b9-4ff2-a025-f1f904de0066"
Apr 16 20:12:32.393505 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:32.393056 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tjktw" podUID="9a8f93c5-d267-47b5-a685-fc5bd8269d88"
Apr 16 20:12:33.170307 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.170231 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeReady"
Apr 16 20:12:33.170454 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.170365 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 20:12:33.217119 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.217076 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h7pxk"]
Apr 16 20:12:33.247671 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.247643 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bd6r6"]
Apr 16 20:12:33.247844 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.247794 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h7pxk"
Apr 16 20:12:33.250703 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.250678 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:12:33.250850 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.250678 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tnjdz\""
Apr 16 20:12:33.250850 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.250680 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:12:33.267230 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.267208 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h7pxk"]
Apr 16 20:12:33.267230 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.267233 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bd6r6"]
Apr 16 20:12:33.267391 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.267244 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vq5vz"]
Apr 16 20:12:33.267391 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.267359 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bd6r6"
Apr 16 20:12:33.269870 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.269820 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:12:33.269997 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.269854 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:12:33.269997 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.269928 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:12:33.269997 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.269934 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bmglr\""
Apr 16 20:12:33.270349 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.270335 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:12:33.294729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.294700 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vq5vz"]
Apr 16 20:12:33.294866 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.294815 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vq5vz"
Apr 16 20:12:33.298207 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.298185 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v7v7t\""
Apr 16 20:12:33.298628 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.298612 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:12:33.298723 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.298668 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:12:33.299002 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.298988 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:12:33.361203 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws95f\" (UniqueName: \"kubernetes.io/projected/2380cbc7-d39a-4681-ac0d-6e245781eba4-kube-api-access-ws95f\") pod \"ingress-canary-vq5vz\" (UID: \"2380cbc7-d39a-4681-ac0d-6e245781eba4\") " pod="openshift-ingress-canary/ingress-canary-vq5vz"
Apr 16 20:12:33.361359 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/913f758d-f60f-4832-a358-f15c2a5f2709-crio-socket\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6"
Apr 16 20:12:33.361359 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/913f758d-f60f-4832-a358-f15c2a5f2709-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.361359 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/913f758d-f60f-4832-a358-f15c2a5f2709-data-volume\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.361359 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2380cbc7-d39a-4681-ac0d-6e245781eba4-cert\") pod \"ingress-canary-vq5vz\" (UID: \"2380cbc7-d39a-4681-ac0d-6e245781eba4\") " pod="openshift-ingress-canary/ingress-canary-vq5vz" Apr 16 20:12:33.361359 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1897c5f1-ae77-47b3-96cf-15366561bfa3-config-volume\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.361516 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1897c5f1-ae77-47b3-96cf-15366561bfa3-metrics-tls\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.361516 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:12:33.361377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1897c5f1-ae77-47b3-96cf-15366561bfa3-tmp-dir\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.361516 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rsst\" (UniqueName: \"kubernetes.io/projected/1897c5f1-ae77-47b3-96cf-15366561bfa3-kube-api-access-8rsst\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.361516 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/913f758d-f60f-4832-a358-f15c2a5f2709-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.361633 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.361521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2wr8\" (UniqueName: \"kubernetes.io/projected/913f758d-f60f-4832-a358-f15c2a5f2709-kube-api-access-b2wr8\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.462832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/913f758d-f60f-4832-a358-f15c2a5f2709-crio-socket\") pod \"insights-runtime-extractor-bd6r6\" (UID: 
\"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.462832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/913f758d-f60f-4832-a358-f15c2a5f2709-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.462832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/913f758d-f60f-4832-a358-f15c2a5f2709-data-volume\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.462832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2380cbc7-d39a-4681-ac0d-6e245781eba4-cert\") pod \"ingress-canary-vq5vz\" (UID: \"2380cbc7-d39a-4681-ac0d-6e245781eba4\") " pod="openshift-ingress-canary/ingress-canary-vq5vz" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1897c5f1-ae77-47b3-96cf-15366561bfa3-config-volume\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1897c5f1-ae77-47b3-96cf-15366561bfa3-metrics-tls\") pod 
\"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.462948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/913f758d-f60f-4832-a358-f15c2a5f2709-crio-socket\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1897c5f1-ae77-47b3-96cf-15366561bfa3-tmp-dir\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rsst\" (UniqueName: \"kubernetes.io/projected/1897c5f1-ae77-47b3-96cf-15366561bfa3-kube-api-access-8rsst\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/913f758d-f60f-4832-a358-f15c2a5f2709-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2wr8\" (UniqueName: 
\"kubernetes.io/projected/913f758d-f60f-4832-a358-f15c2a5f2709-kube-api-access-b2wr8\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws95f\" (UniqueName: \"kubernetes.io/projected/2380cbc7-d39a-4681-ac0d-6e245781eba4-kube-api-access-ws95f\") pod \"ingress-canary-vq5vz\" (UID: \"2380cbc7-d39a-4681-ac0d-6e245781eba4\") " pod="openshift-ingress-canary/ingress-canary-vq5vz" Apr 16 20:12:33.463443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1897c5f1-ae77-47b3-96cf-15366561bfa3-tmp-dir\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.463829 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.463514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1897c5f1-ae77-47b3-96cf-15366561bfa3-config-volume\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.466808 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.466788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1897c5f1-ae77-47b3-96cf-15366561bfa3-metrics-tls\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.466918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.466877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2380cbc7-d39a-4681-ac0d-6e245781eba4-cert\") pod \"ingress-canary-vq5vz\" (UID: \"2380cbc7-d39a-4681-ac0d-6e245781eba4\") " pod="openshift-ingress-canary/ingress-canary-vq5vz" Apr 16 20:12:33.471473 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.471449 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rsst\" (UniqueName: \"kubernetes.io/projected/1897c5f1-ae77-47b3-96cf-15366561bfa3-kube-api-access-8rsst\") pod \"dns-default-h7pxk\" (UID: \"1897c5f1-ae77-47b3-96cf-15366561bfa3\") " pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.471586 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.471566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws95f\" (UniqueName: \"kubernetes.io/projected/2380cbc7-d39a-4681-ac0d-6e245781eba4-kube-api-access-ws95f\") pod \"ingress-canary-vq5vz\" (UID: \"2380cbc7-d39a-4681-ac0d-6e245781eba4\") " pod="openshift-ingress-canary/ingress-canary-vq5vz" Apr 16 20:12:33.477192 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.477168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/913f758d-f60f-4832-a358-f15c2a5f2709-data-volume\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.478979 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.478948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/913f758d-f60f-4832-a358-f15c2a5f2709-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.479078 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.479060 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b2wr8\" (UniqueName: \"kubernetes.io/projected/913f758d-f60f-4832-a358-f15c2a5f2709-kube-api-access-b2wr8\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.489784 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.489761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/913f758d-f60f-4832-a358-f15c2a5f2709-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bd6r6\" (UID: \"913f758d-f60f-4832-a358-f15c2a5f2709\") " pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.557233 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.557201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:33.575799 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.575771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bd6r6" Apr 16 20:12:33.603530 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.603497 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vq5vz" Apr 16 20:12:33.740089 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.740060 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h7pxk"] Apr 16 20:12:33.746286 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.746267 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bd6r6"] Apr 16 20:12:33.751384 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:33.751354 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1897c5f1_ae77_47b3_96cf_15366561bfa3.slice/crio-33b101f766250c7daef215cdf5950112e6d502db707896bc40203747f7012dff WatchSource:0}: Error finding container 33b101f766250c7daef215cdf5950112e6d502db707896bc40203747f7012dff: Status 404 returned error can't find the container with id 33b101f766250c7daef215cdf5950112e6d502db707896bc40203747f7012dff Apr 16 20:12:33.751570 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:33.751553 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913f758d_f60f_4832_a358_f15c2a5f2709.slice/crio-67a9da31067da2ba7adb86da6312b1ee990326e5176a008cfd19f25f26143168 WatchSource:0}: Error finding container 67a9da31067da2ba7adb86da6312b1ee990326e5176a008cfd19f25f26143168: Status 404 returned error can't find the container with id 67a9da31067da2ba7adb86da6312b1ee990326e5176a008cfd19f25f26143168 Apr 16 20:12:33.751888 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:33.751866 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vq5vz"] Apr 16 20:12:33.761606 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:33.761583 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2380cbc7_d39a_4681_ac0d_6e245781eba4.slice/crio-60d907202656d2bbfa4907251edab65fc32a21fdf7e653136b37fc2072d99f56 WatchSource:0}: Error finding container 60d907202656d2bbfa4907251edab65fc32a21fdf7e653136b37fc2072d99f56: Status 404 returned error can't find the container with id 60d907202656d2bbfa4907251edab65fc32a21fdf7e653136b37fc2072d99f56 Apr 16 20:12:34.392644 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.392396 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw" Apr 16 20:12:34.392833 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.392419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:34.393131 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.392481 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:34.397310 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.397241 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:34.397827 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.397547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:34.397827 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.397605 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b2xrn\"" Apr 16 20:12:34.397827 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.397661 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm2hn\"" Apr 16 20:12:34.397827 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:12:34.397607 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:34.397827 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.397607 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:12:34.618198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.618100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bd6r6" event={"ID":"913f758d-f60f-4832-a358-f15c2a5f2709","Type":"ContainerStarted","Data":"323a2077720bcfe9c666fcce1add9eee58fd439644d1bff418ae839a7ff5ced1"} Apr 16 20:12:34.618198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.618163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bd6r6" event={"ID":"913f758d-f60f-4832-a358-f15c2a5f2709","Type":"ContainerStarted","Data":"cd743792185fa4febe1d445aba95e36330e2aa2173a6c156f4022344c93e4010"} Apr 16 20:12:34.618198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.618176 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bd6r6" event={"ID":"913f758d-f60f-4832-a358-f15c2a5f2709","Type":"ContainerStarted","Data":"67a9da31067da2ba7adb86da6312b1ee990326e5176a008cfd19f25f26143168"} Apr 16 20:12:34.619447 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.619417 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vq5vz" event={"ID":"2380cbc7-d39a-4681-ac0d-6e245781eba4","Type":"ContainerStarted","Data":"60d907202656d2bbfa4907251edab65fc32a21fdf7e653136b37fc2072d99f56"} Apr 16 20:12:34.620695 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:34.620657 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h7pxk" 
event={"ID":"1897c5f1-ae77-47b3-96cf-15366561bfa3","Type":"ContainerStarted","Data":"33b101f766250c7daef215cdf5950112e6d502db707896bc40203747f7012dff"} Apr 16 20:12:36.628042 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.627752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bd6r6" event={"ID":"913f758d-f60f-4832-a358-f15c2a5f2709","Type":"ContainerStarted","Data":"679f7219c015eab80b0ea91dbb0255c2d598357ab2a10ee4430f3f100521e935"} Apr 16 20:12:36.629145 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.629121 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vq5vz" event={"ID":"2380cbc7-d39a-4681-ac0d-6e245781eba4","Type":"ContainerStarted","Data":"359458b78774e5c9e71645caa4be45fb01a88e5f30865a6699048cac902b197c"} Apr 16 20:12:36.630570 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.630551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h7pxk" event={"ID":"1897c5f1-ae77-47b3-96cf-15366561bfa3","Type":"ContainerStarted","Data":"3b6e23254bb1fe994081c8cf8706f66b305cb435c28becaccf2ab9e20fb78526"} Apr 16 20:12:36.630636 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.630577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h7pxk" event={"ID":"1897c5f1-ae77-47b3-96cf-15366561bfa3","Type":"ContainerStarted","Data":"f304ad8939d8d3e44583f09609def9f30bee9654a2c407c77e67ee987df16f61"} Apr 16 20:12:36.630684 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.630671 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h7pxk" Apr 16 20:12:36.646706 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.646663 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bd6r6" podStartSLOduration=1.3308409939999999 podStartE2EDuration="3.646650211s" 
podCreationTimestamp="2026-04-16 20:12:33 +0000 UTC" firstStartedPulling="2026-04-16 20:12:33.859211206 +0000 UTC m=+51.034280356" lastFinishedPulling="2026-04-16 20:12:36.175020412 +0000 UTC m=+53.350089573" observedRunningTime="2026-04-16 20:12:36.64551513 +0000 UTC m=+53.820584321" watchObservedRunningTime="2026-04-16 20:12:36.646650211 +0000 UTC m=+53.821719373" Apr 16 20:12:36.661777 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.661693 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h7pxk" podStartSLOduration=1.295400747 podStartE2EDuration="3.66167689s" podCreationTimestamp="2026-04-16 20:12:33 +0000 UTC" firstStartedPulling="2026-04-16 20:12:33.753263477 +0000 UTC m=+50.928332623" lastFinishedPulling="2026-04-16 20:12:36.119539619 +0000 UTC m=+53.294608766" observedRunningTime="2026-04-16 20:12:36.660608615 +0000 UTC m=+53.835677782" watchObservedRunningTime="2026-04-16 20:12:36.66167689 +0000 UTC m=+53.836746060" Apr 16 20:12:36.676027 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:36.675986 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vq5vz" podStartSLOduration=1.31977348 podStartE2EDuration="3.675965802s" podCreationTimestamp="2026-04-16 20:12:33 +0000 UTC" firstStartedPulling="2026-04-16 20:12:33.76337453 +0000 UTC m=+50.938443676" lastFinishedPulling="2026-04-16 20:12:36.11956684 +0000 UTC m=+53.294635998" observedRunningTime="2026-04-16 20:12:36.675486781 +0000 UTC m=+53.850555950" watchObservedRunningTime="2026-04-16 20:12:36.675965802 +0000 UTC m=+53.851034970" Apr 16 20:12:39.204166 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.204131 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bfc9bb6c9-d9tjd"] Apr 16 20:12:39.207037 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.207018 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bfc9bb6c9-d9tjd" Apr 16 20:12:39.209367 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.209334 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:12:39.209472 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.209396 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:12:39.209472 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.209402 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:12:39.210407 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.210384 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:12:39.210491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.210455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:12:39.210644 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.210626 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:12:39.210706 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.210629 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:12:39.211126 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.211089 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qtq7x\"" Apr 16 20:12:39.216527 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.216348 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bfc9bb6c9-d9tjd"] Apr 16 20:12:39.308330 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:12:39.308285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-oauth-serving-cert\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.308330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.308331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd948\" (UniqueName: \"kubernetes.io/projected/7656bf12-277a-4f69-8c48-491875a7c565-kube-api-access-hd948\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.308549 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.308362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-oauth-config\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.308549 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.308398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-serving-cert\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.308549 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.308421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-service-ca\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.308549 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.308492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-console-config\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.409689 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.409661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-oauth-serving-cert\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.409689 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.409693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd948\" (UniqueName: \"kubernetes.io/projected/7656bf12-277a-4f69-8c48-491875a7c565-kube-api-access-hd948\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.409918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.409713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-oauth-config\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.409918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.409736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-serving-cert\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.409918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.409760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-service-ca\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.409918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.409899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-console-config\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.411260 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.411236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-service-ca\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.411415 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.411258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-oauth-serving-cert\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.411415 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.411281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-console-config\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.413717 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.413695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-oauth-config\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.413815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.413788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-serving-cert\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.417703 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.417681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd948\" (UniqueName: \"kubernetes.io/projected/7656bf12-277a-4f69-8c48-491875a7c565-kube-api-access-hd948\") pod \"console-7bfc9bb6c9-d9tjd\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") " pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.520063 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.520030 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:12:39.636386 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.636354 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bfc9bb6c9-d9tjd"]
Apr 16 20:12:39.639492 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:39.639462 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7656bf12_277a_4f69_8c48_491875a7c565.slice/crio-3b4b290d1e8a096634710b277798c735abbed16a77b904b1c2e3ac24db0297f2 WatchSource:0}: Error finding container 3b4b290d1e8a096634710b277798c735abbed16a77b904b1c2e3ac24db0297f2: Status 404 returned error can't find the container with id 3b4b290d1e8a096634710b277798c735abbed16a77b904b1c2e3ac24db0297f2
Apr 16 20:12:39.813167 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.813082 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-c5znm"]
Apr 16 20:12:39.847768 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.847739 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-c5znm"]
Apr 16 20:12:39.847915 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.847858 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:39.850728 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.850707 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 20:12:39.850728 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.850722 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:12:39.850919 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.850742 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 20:12:39.851552 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.851530 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:12:39.851552 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.851542 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:12:39.851741 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.851592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-mnvzz\""
Apr 16 20:12:39.913451 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.913413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:39.913622 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.913476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6xz\" (UniqueName: \"kubernetes.io/projected/a6a0faef-f097-4252-897d-03c4dc8946b1-kube-api-access-pm6xz\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:39.913622 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.913508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:39.913622 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:39.913538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6a0faef-f097-4252-897d-03c4dc8946b1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.014389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.014350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.014560 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.014403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6a0faef-f097-4252-897d-03c4dc8946b1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.014560 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.014437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.014560 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.014468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6xz\" (UniqueName: \"kubernetes.io/projected/a6a0faef-f097-4252-897d-03c4dc8946b1-kube-api-access-pm6xz\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.014717 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:40.014702 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 20:12:40.014783 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:40.014770 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-tls podName:a6a0faef-f097-4252-897d-03c4dc8946b1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:40.514752753 +0000 UTC m=+57.689821910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-c5znm" (UID: "a6a0faef-f097-4252-897d-03c4dc8946b1") : secret "prometheus-operator-tls" not found
Apr 16 20:12:40.015194 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.015172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6a0faef-f097-4252-897d-03c4dc8946b1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.016788 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.016764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.025038 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.025016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6xz\" (UniqueName: \"kubernetes.io/projected/a6a0faef-f097-4252-897d-03c4dc8946b1-kube-api-access-pm6xz\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.517859 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.517810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.520553 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.520527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6a0faef-f097-4252-897d-03c4dc8946b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c5znm\" (UID: \"a6a0faef-f097-4252-897d-03c4dc8946b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.642941 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.642901 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bfc9bb6c9-d9tjd" event={"ID":"7656bf12-277a-4f69-8c48-491875a7c565","Type":"ContainerStarted","Data":"3b4b290d1e8a096634710b277798c735abbed16a77b904b1c2e3ac24db0297f2"}
Apr 16 20:12:40.757151 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.757093 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm"
Apr 16 20:12:40.905916 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:40.905884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-c5znm"]
Apr 16 20:12:40.909514 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:40.909472 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a0faef_f097_4252_897d_03c4dc8946b1.slice/crio-de631c1300805bc14022362785affefa574857bd475404b51e33b76c8579346d WatchSource:0}: Error finding container de631c1300805bc14022362785affefa574857bd475404b51e33b76c8579346d: Status 404 returned error can't find the container with id de631c1300805bc14022362785affefa574857bd475404b51e33b76c8579346d
Apr 16 20:12:41.575451 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:41.575423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh5vc"
Apr 16 20:12:41.645873 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:41.645839 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm" event={"ID":"a6a0faef-f097-4252-897d-03c4dc8946b1","Type":"ContainerStarted","Data":"de631c1300805bc14022362785affefa574857bd475404b51e33b76c8579346d"}
Apr 16 20:12:43.653081 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:43.653046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bfc9bb6c9-d9tjd" event={"ID":"7656bf12-277a-4f69-8c48-491875a7c565","Type":"ContainerStarted","Data":"74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391"}
Apr 16 20:12:43.654773 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:43.654747 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm" event={"ID":"a6a0faef-f097-4252-897d-03c4dc8946b1","Type":"ContainerStarted","Data":"8eea36617858feda5b3892bcbf3b8cb3be13fc2725e5210c944f3284b566f8c9"}
Apr 16 20:12:43.654923 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:43.654778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm" event={"ID":"a6a0faef-f097-4252-897d-03c4dc8946b1","Type":"ContainerStarted","Data":"c569f8e983d1f4b51baa867d4756d85ed21feb2c746f4280e06a25bcad9dbde0"}
Apr 16 20:12:43.675377 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:43.675330 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bfc9bb6c9-d9tjd" podStartSLOduration=1.245188154 podStartE2EDuration="4.675314816s" podCreationTimestamp="2026-04-16 20:12:39 +0000 UTC" firstStartedPulling="2026-04-16 20:12:39.641286129 +0000 UTC m=+56.816355280" lastFinishedPulling="2026-04-16 20:12:43.071412797 +0000 UTC m=+60.246481942" observedRunningTime="2026-04-16 20:12:43.674508802 +0000 UTC m=+60.849577970" watchObservedRunningTime="2026-04-16 20:12:43.675314816 +0000 UTC m=+60.850383987"
Apr 16 20:12:43.697901 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:43.697852 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-c5znm" podStartSLOduration=2.539122017 podStartE2EDuration="4.697839582s" podCreationTimestamp="2026-04-16 20:12:39 +0000 UTC" firstStartedPulling="2026-04-16 20:12:40.911984648 +0000 UTC m=+58.087053823" lastFinishedPulling="2026-04-16 20:12:43.070702243 +0000 UTC m=+60.245771388" observedRunningTime="2026-04-16 20:12:43.696245104 +0000 UTC m=+60.871314271" watchObservedRunningTime="2026-04-16 20:12:43.697839582 +0000 UTC m=+60.872908749"
Apr 16 20:12:45.227080 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.227047 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kmhvt"]
Apr 16 20:12:45.231267 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.231247 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.233872 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.233853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6m8jn\""
Apr 16 20:12:45.234011 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.233905 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:12:45.234011 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.233943 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:12:45.234011 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.233954 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:12:45.354706 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-textfile\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.354898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b579133-c04b-4dcc-beca-86c21e3982a1-metrics-client-ca\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.354898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-tls\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.354898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4sgn\" (UniqueName: \"kubernetes.io/projected/5b579133-c04b-4dcc-beca-86c21e3982a1-kube-api-access-x4sgn\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.354898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-root\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.354898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.355055 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-wtmp\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.355055 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-accelerators-collector-config\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.355055 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.354994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-sys\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.455934 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.455891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b579133-c04b-4dcc-beca-86c21e3982a1-metrics-client-ca\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.455934 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.455941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-tls\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.455982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4sgn\" (UniqueName: \"kubernetes.io/projected/5b579133-c04b-4dcc-beca-86c21e3982a1-kube-api-access-x4sgn\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-root\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-wtmp\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-accelerators-collector-config\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-root\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456198 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:45.456189 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:12:45.456534 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:45.456247 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-tls podName:5b579133-c04b-4dcc-beca-86c21e3982a1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.956227427 +0000 UTC m=+63.131296578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-tls") pod "node-exporter-kmhvt" (UID: "5b579133-c04b-4dcc-beca-86c21e3982a1") : secret "node-exporter-tls" not found
Apr 16 20:12:45.456534 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-wtmp\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456534 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-sys\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456534 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-textfile\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456534 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456381 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b579133-c04b-4dcc-beca-86c21e3982a1-sys\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456716 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-textfile\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456716 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456668 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-accelerators-collector-config\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.456716 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.456669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b579133-c04b-4dcc-beca-86c21e3982a1-metrics-client-ca\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.458428 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.458407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.464757 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.464729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4sgn\" (UniqueName: \"kubernetes.io/projected/5b579133-c04b-4dcc-beca-86c21e3982a1-kube-api-access-x4sgn\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.960941 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.960911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-tls\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:45.963270 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:45.963251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b579133-c04b-4dcc-beca-86c21e3982a1-node-exporter-tls\") pod \"node-exporter-kmhvt\" (UID: \"5b579133-c04b-4dcc-beca-86c21e3982a1\") " pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:46.140393 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:46.140362 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kmhvt"
Apr 16 20:12:46.148470 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:46.148435 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b579133_c04b_4dcc_beca_86c21e3982a1.slice/crio-0946a5260f7f255569a411c6e1420d3c31e02572312f21375319ad1da541d2ba WatchSource:0}: Error finding container 0946a5260f7f255569a411c6e1420d3c31e02572312f21375319ad1da541d2ba: Status 404 returned error can't find the container with id 0946a5260f7f255569a411c6e1420d3c31e02572312f21375319ad1da541d2ba
Apr 16 20:12:46.635870 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:46.635836 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h7pxk"
Apr 16 20:12:46.663051 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:46.663018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kmhvt" event={"ID":"5b579133-c04b-4dcc-beca-86c21e3982a1","Type":"ContainerStarted","Data":"0946a5260f7f255569a411c6e1420d3c31e02572312f21375319ad1da541d2ba"}
Apr 16 20:12:47.667685 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:47.667651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kmhvt" event={"ID":"5b579133-c04b-4dcc-beca-86c21e3982a1","Type":"ContainerStarted","Data":"b68c119db101fc9ca916d530e7a09f72aaa1dce64e20377359d511a800491b39"}
Apr 16 20:12:48.342908 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.342877 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6ffb987485-vcq2w"]
Apr 16 20:12:48.361862 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.361832 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6ffb987485-vcq2w"]
Apr 16 20:12:48.362016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.361968 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:12:48.364546 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.364520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 20:12:48.365064 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.365044 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 20:12:48.365168 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.365144 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-6p9gn\""
Apr 16 20:12:48.365283 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.365215 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 20:12:48.365542 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.365522 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dj13i78uscbbh\""
Apr 16 20:12:48.365659 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.365603 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 20:12:48.365744 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.365731 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 20:12:48.481980 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.481942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:12:48.482142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.481990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:12:48.482142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.482011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-grpc-tls\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:12:48.482142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.482041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-tls\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:12:48.482142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.482061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName:
\"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.482142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.482082 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/723b9f65-5d9e-43f5-86f3-1e0655fdc449-metrics-client-ca\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.482341 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.482173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gcb\" (UniqueName: \"kubernetes.io/projected/723b9f65-5d9e-43f5-86f3-1e0655fdc449-kube-api-access-26gcb\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.482341 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.482200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.582765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.582728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-tls\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: 
\"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.582765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.582769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.582986 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.582795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/723b9f65-5d9e-43f5-86f3-1e0655fdc449-metrics-client-ca\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.582986 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.582848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gcb\" (UniqueName: \"kubernetes.io/projected/723b9f65-5d9e-43f5-86f3-1e0655fdc449-kube-api-access-26gcb\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.582986 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.582874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.582986 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:12:48.582961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.583217 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.583012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.583217 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.583038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-grpc-tls\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.583655 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.583613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/723b9f65-5d9e-43f5-86f3-1e0655fdc449-metrics-client-ca\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.585621 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.585598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.585772 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.585750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-grpc-tls\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.585889 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.585874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.585969 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.585929 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.586085 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.586065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-tls\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " 
pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.586339 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.586320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/723b9f65-5d9e-43f5-86f3-1e0655fdc449-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.594021 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.593975 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gcb\" (UniqueName: \"kubernetes.io/projected/723b9f65-5d9e-43f5-86f3-1e0655fdc449-kube-api-access-26gcb\") pod \"thanos-querier-6ffb987485-vcq2w\" (UID: \"723b9f65-5d9e-43f5-86f3-1e0655fdc449\") " pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.671234 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.671189 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" Apr 16 20:12:48.672553 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.672524 2574 generic.go:358] "Generic (PLEG): container finished" podID="5b579133-c04b-4dcc-beca-86c21e3982a1" containerID="b68c119db101fc9ca916d530e7a09f72aaa1dce64e20377359d511a800491b39" exitCode=0 Apr 16 20:12:48.672633 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.672570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kmhvt" event={"ID":"5b579133-c04b-4dcc-beca-86c21e3982a1","Type":"ContainerDied","Data":"b68c119db101fc9ca916d530e7a09f72aaa1dce64e20377359d511a800491b39"} Apr 16 20:12:48.803463 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:48.803432 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6ffb987485-vcq2w"] Apr 16 20:12:48.806650 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:48.806624 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723b9f65_5d9e_43f5_86f3_1e0655fdc449.slice/crio-143942eb3199816be4830bbfd5d7f0288b914f27067abafea82650fff4699f86 WatchSource:0}: Error finding container 143942eb3199816be4830bbfd5d7f0288b914f27067abafea82650fff4699f86: Status 404 returned error can't find the container with id 143942eb3199816be4830bbfd5d7f0288b914f27067abafea82650fff4699f86 Apr 16 20:12:49.187582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.187493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:49.190243 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.190213 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:49.200765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.200735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/604b143f-56b9-4ff2-a025-f1f904de0066-metrics-certs\") pod \"network-metrics-daemon-gmj69\" (UID: \"604b143f-56b9-4ff2-a025-f1f904de0066\") " pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:49.288443 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.288403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:49.290857 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.290821 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:49.301139 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.301102 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:49.311712 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.311688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zf4\" (UniqueName: \"kubernetes.io/projected/3251e838-9ac0-43bc-88bb-3f2002d4ad60-kube-api-access-s5zf4\") pod \"network-check-target-nhjrd\" (UID: \"3251e838-9ac0-43bc-88bb-3f2002d4ad60\") " pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:49.418754 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.418722 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b2xrn\"" Apr 16 20:12:49.424962 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.424935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm2hn\"" Apr 16 20:12:49.426881 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.426862 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmj69" Apr 16 20:12:49.433600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.433575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nhjrd" Apr 16 20:12:49.521330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.521302 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7bfc9bb6c9-d9tjd" Apr 16 20:12:49.522416 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.521964 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bfc9bb6c9-d9tjd" Apr 16 20:12:49.528294 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.528261 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bfc9bb6c9-d9tjd" Apr 16 20:12:49.570293 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.570259 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nhjrd"] Apr 16 20:12:49.573682 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:49.573650 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3251e838_9ac0_43bc_88bb_3f2002d4ad60.slice/crio-4f4e242ac10f3e1d6ea9c85bb088f0e6b1afee571dc8bbd1bbc3f38f46938b89 WatchSource:0}: Error finding container 4f4e242ac10f3e1d6ea9c85bb088f0e6b1afee571dc8bbd1bbc3f38f46938b89: Status 404 
returned error can't find the container with id 4f4e242ac10f3e1d6ea9c85bb088f0e6b1afee571dc8bbd1bbc3f38f46938b89 Apr 16 20:12:49.587748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.587347 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gmj69"] Apr 16 20:12:49.590700 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:49.590662 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod604b143f_56b9_4ff2_a025_f1f904de0066.slice/crio-303f517d80cd68a1d45518ee29a9041f49065f582c34e08727e7eab0602c82e8 WatchSource:0}: Error finding container 303f517d80cd68a1d45518ee29a9041f49065f582c34e08727e7eab0602c82e8: Status 404 returned error can't find the container with id 303f517d80cd68a1d45518ee29a9041f49065f582c34e08727e7eab0602c82e8 Apr 16 20:12:49.676791 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.676750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"143942eb3199816be4830bbfd5d7f0288b914f27067abafea82650fff4699f86"} Apr 16 20:12:49.677864 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.677829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nhjrd" event={"ID":"3251e838-9ac0-43bc-88bb-3f2002d4ad60","Type":"ContainerStarted","Data":"4f4e242ac10f3e1d6ea9c85bb088f0e6b1afee571dc8bbd1bbc3f38f46938b89"} Apr 16 20:12:49.679074 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.679052 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmj69" event={"ID":"604b143f-56b9-4ff2-a025-f1f904de0066","Type":"ContainerStarted","Data":"303f517d80cd68a1d45518ee29a9041f49065f582c34e08727e7eab0602c82e8"} Apr 16 20:12:49.681146 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.681098 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kmhvt" event={"ID":"5b579133-c04b-4dcc-beca-86c21e3982a1","Type":"ContainerStarted","Data":"f6bb92124cb36a3a922f1d72b44534ca23b58078ec6b67ada0a3b6e3c9c61b13"} Apr 16 20:12:49.681260 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.681151 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kmhvt" event={"ID":"5b579133-c04b-4dcc-beca-86c21e3982a1","Type":"ContainerStarted","Data":"06e294aa3387084f7c8603796597a6a58577032a0504c90b06a7cec2a18c5adc"} Apr 16 20:12:49.686876 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.686853 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bfc9bb6c9-d9tjd" Apr 16 20:12:49.700683 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.700594 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kmhvt" podStartSLOduration=3.392465312 podStartE2EDuration="4.700581747s" podCreationTimestamp="2026-04-16 20:12:45 +0000 UTC" firstStartedPulling="2026-04-16 20:12:46.150589601 +0000 UTC m=+63.325658750" lastFinishedPulling="2026-04-16 20:12:47.458706039 +0000 UTC m=+64.633775185" observedRunningTime="2026-04-16 20:12:49.700138552 +0000 UTC m=+66.875207720" watchObservedRunningTime="2026-04-16 20:12:49.700581747 +0000 UTC m=+66.875650914" Apr 16 20:12:49.705906 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.705879 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-766f745c7-bkbk7"] Apr 16 20:12:49.710537 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.710469 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.713031 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.713008 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 20:12:49.713507 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.713353 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 20:12:49.713507 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.713408 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 20:12:49.713507 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.713436 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6ign95co3me0g\"" Apr 16 20:12:49.713696 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.713556 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jvn2t\"" Apr 16 20:12:49.713981 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.713961 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 20:12:49.720626 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.720606 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-766f745c7-bkbk7"] Apr 16 20:12:49.794762 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.794723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-secret-metrics-server-tls\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " 
pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.794922 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.794774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2870419-823d-4e89-99b0-90e1ff3cba57-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.794922 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.794813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldzbp\" (UniqueName: \"kubernetes.io/projected/b2870419-823d-4e89-99b0-90e1ff3cba57-kube-api-access-ldzbp\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.794922 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.794916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b2870419-823d-4e89-99b0-90e1ff3cba57-audit-log\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.795092 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.794963 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b2870419-823d-4e89-99b0-90e1ff3cba57-metrics-server-audit-profiles\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.795092 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.795051 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-client-ca-bundle\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.795232 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.795210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-secret-metrics-server-client-certs\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.895977 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.895937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-client-ca-bundle\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.896203 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.895999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-secret-metrics-server-client-certs\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.896203 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.896162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-secret-metrics-server-tls\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.896321 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.896219 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2870419-823d-4e89-99b0-90e1ff3cba57-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.896321 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.896247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldzbp\" (UniqueName: \"kubernetes.io/projected/b2870419-823d-4e89-99b0-90e1ff3cba57-kube-api-access-ldzbp\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.896321 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.896301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b2870419-823d-4e89-99b0-90e1ff3cba57-audit-log\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.896469 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.896332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b2870419-823d-4e89-99b0-90e1ff3cba57-metrics-server-audit-profiles\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " 
pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.897458 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.897402 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b2870419-823d-4e89-99b0-90e1ff3cba57-metrics-server-audit-profiles\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.897458 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.897439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b2870419-823d-4e89-99b0-90e1ff3cba57-audit-log\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.897825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.897803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2870419-823d-4e89-99b0-90e1ff3cba57-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.898895 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.898872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-secret-metrics-server-tls\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.899004 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.898922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-client-ca-bundle\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.899058 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.899027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b2870419-823d-4e89-99b0-90e1ff3cba57-secret-metrics-server-client-certs\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.905141 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.905117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldzbp\" (UniqueName: \"kubernetes.io/projected/b2870419-823d-4e89-99b0-90e1ff3cba57-kube-api-access-ldzbp\") pod \"metrics-server-766f745c7-bkbk7\" (UID: \"b2870419-823d-4e89-99b0-90e1ff3cba57\") " pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:49.981995 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.981906 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz"] Apr 16 20:12:49.985267 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.985236 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" Apr 16 20:12:49.987431 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.987404 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 20:12:49.987431 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.987418 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dfvts\"" Apr 16 20:12:49.992172 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:49.992127 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz"] Apr 16 20:12:50.024461 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.024425 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:12:50.097427 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.097393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/814dcf6b-d546-4e89-ba47-3f454a8db7ad-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8ttxz\" (UID: \"814dcf6b-d546-4e89-ba47-3f454a8db7ad\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" Apr 16 20:12:50.198899 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.198854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/814dcf6b-d546-4e89-ba47-3f454a8db7ad-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8ttxz\" (UID: \"814dcf6b-d546-4e89-ba47-3f454a8db7ad\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" Apr 16 20:12:50.199074 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:50.198984 2574 secret.go:189] Couldn't get secret 
openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 20:12:50.199074 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:12:50.199059 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/814dcf6b-d546-4e89-ba47-3f454a8db7ad-monitoring-plugin-cert podName:814dcf6b-d546-4e89-ba47-3f454a8db7ad nodeName:}" failed. No retries permitted until 2026-04-16 20:12:50.699039468 +0000 UTC m=+67.874108629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/814dcf6b-d546-4e89-ba47-3f454a8db7ad-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-8ttxz" (UID: "814dcf6b-d546-4e89-ba47-3f454a8db7ad") : secret "monitoring-plugin-cert" not found Apr 16 20:12:50.649867 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.649831 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64dc664459-q7jc9"] Apr 16 20:12:50.653058 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.653039 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.661857 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.661820 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 20:12:50.663406 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.663372 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64dc664459-q7jc9"] Apr 16 20:12:50.703429 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.703391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/814dcf6b-d546-4e89-ba47-3f454a8db7ad-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8ttxz\" (UID: \"814dcf6b-d546-4e89-ba47-3f454a8db7ad\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" Apr 16 20:12:50.706517 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.706487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/814dcf6b-d546-4e89-ba47-3f454a8db7ad-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8ttxz\" (UID: \"814dcf6b-d546-4e89-ba47-3f454a8db7ad\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" Apr 16 20:12:50.804835 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.804792 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-oauth-serving-cert\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.805025 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.804852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-trusted-ca-bundle\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.805025 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.804930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-serving-cert\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.805025 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.804982 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-oauth-config\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.805212 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.805027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl2mr\" (UniqueName: \"kubernetes.io/projected/dc6a280e-594f-4385-8810-45fc682a3aff-kube-api-access-bl2mr\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.805328 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.805304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-console-config\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.805414 
ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.805397 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-service-ca\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.898129 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.898072 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" Apr 16 20:12:50.906097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-trusted-ca-bundle\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.906097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-serving-cert\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.906327 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-oauth-config\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.906381 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906349 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bl2mr\" (UniqueName: \"kubernetes.io/projected/dc6a280e-594f-4385-8810-45fc682a3aff-kube-api-access-bl2mr\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.906417 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-console-config\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.906468 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-service-ca\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.906515 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.906502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-oauth-serving-cert\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.907207 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.907072 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-trusted-ca-bundle\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.907207 ip-10-0-134-158 kubenswrapper[2574]: 
I0416 20:12:50.907135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-oauth-serving-cert\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.907388 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.907308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-console-config\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.907575 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.907550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-service-ca\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.909148 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.909127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-oauth-config\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.909275 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.909257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-serving-cert\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.914686 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:12:50.914665 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl2mr\" (UniqueName: \"kubernetes.io/projected/dc6a280e-594f-4385-8810-45fc682a3aff-kube-api-access-bl2mr\") pod \"console-64dc664459-q7jc9\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:50.965265 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:50.965198 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:12:51.131550 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.131518 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz"] Apr 16 20:12:51.137493 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:51.137460 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814dcf6b_d546_4e89_ba47_3f454a8db7ad.slice/crio-791f700a48372df1d47f761d54a3ec2b49224cde0ca017ad3bf1fc3195c0c9ea WatchSource:0}: Error finding container 791f700a48372df1d47f761d54a3ec2b49224cde0ca017ad3bf1fc3195c0c9ea: Status 404 returned error can't find the container with id 791f700a48372df1d47f761d54a3ec2b49224cde0ca017ad3bf1fc3195c0c9ea Apr 16 20:12:51.154642 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.153279 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-766f745c7-bkbk7"] Apr 16 20:12:51.162788 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:51.162746 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2870419_823d_4e89_99b0_90e1ff3cba57.slice/crio-d6f91f5efc0d23416c46e0369ec74bd1f441bfbc3adc686033f5afb6fda7de76 WatchSource:0}: Error finding container d6f91f5efc0d23416c46e0369ec74bd1f441bfbc3adc686033f5afb6fda7de76: Status 404 returned error 
can't find the container with id d6f91f5efc0d23416c46e0369ec74bd1f441bfbc3adc686033f5afb6fda7de76 Apr 16 20:12:51.177831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.169172 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64dc664459-q7jc9"] Apr 16 20:12:51.177831 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:51.177199 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6a280e_594f_4385_8810_45fc682a3aff.slice/crio-8879593f1e5fbbc610d7a5ea9d4f2d2e0e5cdbc3a65b2caea1d2e4f1f0872f05 WatchSource:0}: Error finding container 8879593f1e5fbbc610d7a5ea9d4f2d2e0e5cdbc3a65b2caea1d2e4f1f0872f05: Status 404 returned error can't find the container with id 8879593f1e5fbbc610d7a5ea9d4f2d2e0e5cdbc3a65b2caea1d2e4f1f0872f05 Apr 16 20:12:51.689623 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.689592 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:12:51.693846 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.693815 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.694844 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.694818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmj69" event={"ID":"604b143f-56b9-4ff2-a025-f1f904de0066","Type":"ContainerStarted","Data":"75c5698c8efa9eb49d8685df9e3e58aefb239ed9c2e8417ef6a8270941ac098d"} Apr 16 20:12:51.694986 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.694970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmj69" event={"ID":"604b143f-56b9-4ff2-a025-f1f904de0066","Type":"ContainerStarted","Data":"7a5df13e593ab5930080d9ce80ef702cae3e9904421e4943b8f702ffb98f7229"} Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4brmovh2s1os\"" Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699620 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699812 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699816 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699871 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 20:12:51.699967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.699876 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 20:12:51.700854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.700001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 20:12:51.700854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.700583 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 20:12:51.701592 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701020 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-zdsnm\"" Apr 16 20:12:51.701592 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701191 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 20:12:51.701592 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"89057ff3ffe0d1ad80b38010825ecbdf8ff7f902958ec55daf434341e3156aa9"} Apr 16 20:12:51.701592 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"f319ced7d5cf8caee4f18e2cbfb6e8dd700a5d7c2366bd274e40470271dd3df6"} Apr 16 20:12:51.701592 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701454 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"3cab69c72d0b19fdf0a9c24aeac24b83af036e2017ae76a8c1ef38b86863d029"} Apr 16 20:12:51.701592 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701538 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 20:12:51.702362 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701645 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 20:12:51.702362 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.701736 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 20:12:51.706710 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.705807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64dc664459-q7jc9" event={"ID":"dc6a280e-594f-4385-8810-45fc682a3aff","Type":"ContainerStarted","Data":"a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de"} Apr 16 20:12:51.706710 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.705839 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64dc664459-q7jc9" event={"ID":"dc6a280e-594f-4385-8810-45fc682a3aff","Type":"ContainerStarted","Data":"8879593f1e5fbbc610d7a5ea9d4f2d2e0e5cdbc3a65b2caea1d2e4f1f0872f05"} Apr 16 20:12:51.707519 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.707482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" event={"ID":"814dcf6b-d546-4e89-ba47-3f454a8db7ad","Type":"ContainerStarted","Data":"791f700a48372df1d47f761d54a3ec2b49224cde0ca017ad3bf1fc3195c0c9ea"} Apr 16 20:12:51.710599 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.710577 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:12:51.711179 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.711155 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" event={"ID":"b2870419-823d-4e89-99b0-90e1ff3cba57","Type":"ContainerStarted","Data":"d6f91f5efc0d23416c46e0369ec74bd1f441bfbc3adc686033f5afb6fda7de76"} Apr 16 20:12:51.761675 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.761165 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64dc664459-q7jc9" podStartSLOduration=1.7599662600000001 podStartE2EDuration="1.75996626s" podCreationTimestamp="2026-04-16 20:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:51.75882742 +0000 UTC m=+68.933896601" watchObservedRunningTime="2026-04-16 20:12:51.75996626 +0000 UTC m=+68.935035430" Apr 16 20:12:51.777198 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.777097 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gmj69" podStartSLOduration=67.417031257 podStartE2EDuration="1m8.777075371s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:12:49.593249261 +0000 UTC m=+66.768318413" lastFinishedPulling="2026-04-16 20:12:50.953293377 +0000 UTC m=+68.128362527" observedRunningTime="2026-04-16 20:12:51.775552643 +0000 UTC m=+68.950621813" watchObservedRunningTime="2026-04-16 20:12:51.777075371 +0000 UTC m=+68.952144539" Apr 16 20:12:51.823313 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.822826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.823488 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.823368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.823859 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.823833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.824748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.824001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.824748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.824142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 20:12:51.824748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.824350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.825444 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.825444 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.825444 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tslwf\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-kube-api-access-tslwf\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:12:51.825444 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.825444 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.825719 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.825719 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.827825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.825996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.827825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.826388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.827825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.826611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.827825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.826652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config-out\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.827825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.826675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-web-config\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config-out\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-web-config\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927911 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-tslwf\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-kube-api-access-tslwf\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.927993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.928026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.928052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.929091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.928086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.930217 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.928130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.930217 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.928930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.931314 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.931274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.933597 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.932294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.933597 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.932876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.938722 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.938179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.939349 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.938978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.940495 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.939957 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.940495 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.940006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.943255 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.942933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.946646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.946076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.946646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.946595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.947065 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.947023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tslwf\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-kube-api-access-tslwf\") pod
\"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.948516 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.948351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config-out\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.948516 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.948439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-web-config\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.948692 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.948630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.948980 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.948959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.950794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.950720 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:51.957969 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:51.953265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:52.034830 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:52.034788 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:12:54.163856 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.163811 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:12:54.168592 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:54.168548 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2440bb23_1627_4bc0_8d2b_50cbf4c470e4.slice/crio-d2bade7a677533964263a9264d4bc75920c9e1f991fbc7277b2b8e35c940bfff WatchSource:0}: Error finding container d2bade7a677533964263a9264d4bc75920c9e1f991fbc7277b2b8e35c940bfff: Status 404 returned error can't find the container with id d2bade7a677533964263a9264d4bc75920c9e1f991fbc7277b2b8e35c940bfff
Apr 16 20:12:54.726483 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.726441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"d2bade7a677533964263a9264d4bc75920c9e1f991fbc7277b2b8e35c940bfff"}
Apr 16 20:12:54.730184 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.730150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"1eeed3f8b0bf284d4da058bebccc1c32b2eba29d7cb215d638a203dc5ec26a75"}
Apr 16 20:12:54.730184 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.730188 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"8af1261b98512936df4550e2dc5000ab71d6746f58a04d42803e4494a52ad176"}
Apr 16 20:12:54.730410 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.730202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" event={"ID":"723b9f65-5d9e-43f5-86f3-1e0655fdc449","Type":"ContainerStarted","Data":"a6d4f45644e3df9f07c0ad2d2e60abc0e6e03740cf5663765a4879d5737d327f"}
Apr 16 20:12:54.730410 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.730348 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:12:54.731662 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.731633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" event={"ID":"814dcf6b-d546-4e89-ba47-3f454a8db7ad","Type":"ContainerStarted","Data":"78c260caf10b4b2a562469ca3c1cefefcce091a15b5a8ae1cf8cff64e650cd94"}
Apr 16 20:12:54.732022 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.732003 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz"
Apr 16 20:12:54.733645 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.733617 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" event={"ID":"b2870419-823d-4e89-99b0-90e1ff3cba57","Type":"ContainerStarted","Data":"da9dc891b4ce7dd1e2bbe8be4f10e0682d47da20f1f852ce6f5f08b71e6104db"}
Apr 16 20:12:54.737305 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.737273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nhjrd" event={"ID":"3251e838-9ac0-43bc-88bb-3f2002d4ad60","Type":"ContainerStarted","Data":"e2ba5924e751c7b4ef2c3f9443a3622ee6bb10b5d6c36ad42683d3ba55577081"}
Apr 16 20:12:54.737491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.737472 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:12:54.740070 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.740048 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz"
Apr 16 20:12:54.753864 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.753815 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w" podStartSLOduration=1.565797249 podStartE2EDuration="6.753802416s" podCreationTimestamp="2026-04-16 20:12:48 +0000 UTC" firstStartedPulling="2026-04-16 20:12:48.808501337 +0000 UTC m=+65.983570487" lastFinishedPulling="2026-04-16 20:12:53.996506495 +0000 UTC m=+71.171575654" observedRunningTime="2026-04-16 20:12:54.753209284 +0000 UTC m=+71.928278453" watchObservedRunningTime="2026-04-16 20:12:54.753802416 +0000 UTC m=+71.928871584"
Apr 16 20:12:54.772316 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.771203 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" podStartSLOduration=2.936492928 podStartE2EDuration="5.771182076s" podCreationTimestamp="2026-04-16 20:12:49 +0000 UTC" firstStartedPulling="2026-04-16 20:12:51.165185947 +0000 UTC m=+68.340255099" lastFinishedPulling="2026-04-16 20:12:53.999875086 +0000 UTC m=+71.174944247" observedRunningTime="2026-04-16 20:12:54.769391146 +0000 UTC
m=+71.944460315" watchObservedRunningTime="2026-04-16 20:12:54.771182076 +0000 UTC m=+71.946251245"
Apr 16 20:12:54.785450 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.785393 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nhjrd" podStartSLOduration=67.365308919 podStartE2EDuration="1m11.785378234s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:12:49.576381828 +0000 UTC m=+66.751450975" lastFinishedPulling="2026-04-16 20:12:53.996451129 +0000 UTC m=+71.171520290" observedRunningTime="2026-04-16 20:12:54.784034111 +0000 UTC m=+71.959103285" watchObservedRunningTime="2026-04-16 20:12:54.785378234 +0000 UTC m=+71.960447402"
Apr 16 20:12:54.797828 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:54.797766 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8ttxz" podStartSLOduration=2.939938089 podStartE2EDuration="5.797748492s" podCreationTimestamp="2026-04-16 20:12:49 +0000 UTC" firstStartedPulling="2026-04-16 20:12:51.14029104 +0000 UTC m=+68.315360191" lastFinishedPulling="2026-04-16 20:12:53.998101444 +0000 UTC m=+71.173170594" observedRunningTime="2026-04-16 20:12:54.797161121 +0000 UTC m=+71.972230290" watchObservedRunningTime="2026-04-16 20:12:54.797748492 +0000 UTC m=+71.972817661"
Apr 16 20:12:55.740860 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:55.740824 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c" exitCode=0
Apr 16 20:12:55.741327 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:55.740931 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c"}
Apr 16 20:12:57.382308 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:57.382260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:57.384683 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:57.384662 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 20:12:57.394899 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:57.394873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a8f93c5-d267-47b5-a685-fc5bd8269d88-original-pull-secret\") pod \"global-pull-secret-syncer-tjktw\" (UID: \"9a8f93c5-d267-47b5-a685-fc5bd8269d88\") " pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:57.507456 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:57.507409 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tjktw"
Apr 16 20:12:57.646175 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:57.646072 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tjktw"]
Apr 16 20:12:57.649569 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:12:57.649539 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8f93c5_d267_47b5_a685_fc5bd8269d88.slice/crio-e0785ba25f8bd216690dea27132a832126706937513c2d02f6ed85f0825f3647 WatchSource:0}: Error finding container e0785ba25f8bd216690dea27132a832126706937513c2d02f6ed85f0825f3647: Status 404 returned error can't find the container with id e0785ba25f8bd216690dea27132a832126706937513c2d02f6ed85f0825f3647
Apr 16 20:12:57.749326 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:57.749294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tjktw" event={"ID":"9a8f93c5-d267-47b5-a685-fc5bd8269d88","Type":"ContainerStarted","Data":"e0785ba25f8bd216690dea27132a832126706937513c2d02f6ed85f0825f3647"}
Apr 16 20:12:59.759399 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.759364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e"}
Apr 16 20:12:59.759399 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.759403 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff"}
Apr 16 20:12:59.759911 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.759415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875"}
Apr 16 20:12:59.759911 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.759427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3"}
Apr 16 20:12:59.759911 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.759439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12"}
Apr 16 20:12:59.759911 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.759450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerStarted","Data":"30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23"}
Apr 16 20:12:59.798084 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:12:59.798018 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.266116636 podStartE2EDuration="8.798001123s" podCreationTimestamp="2026-04-16 20:12:51 +0000 UTC" firstStartedPulling="2026-04-16 20:12:54.172051156 +0000 UTC m=+71.347120310" lastFinishedPulling="2026-04-16 20:12:58.703935649 +0000 UTC m=+75.879004797" observedRunningTime="2026-04-16 20:12:59.790127712 +0000 UTC m=+76.965196881" watchObservedRunningTime="2026-04-16 20:12:59.798001123 +0000 UTC m=+76.973070289"
Apr 16 20:13:00.747614 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:00.747587 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-monitoring/thanos-querier-6ffb987485-vcq2w"
Apr 16 20:13:00.965894 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:00.965850 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64dc664459-q7jc9"
Apr 16 20:13:00.966340 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:00.965907 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64dc664459-q7jc9"
Apr 16 20:13:00.971352 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:00.971320 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64dc664459-q7jc9"
Apr 16 20:13:01.770953 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:01.770916 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64dc664459-q7jc9"
Apr 16 20:13:01.821365 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:01.821330 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bfc9bb6c9-d9tjd"]
Apr 16 20:13:02.036003 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:02.035839 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:13:02.773243 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:02.773202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tjktw" event={"ID":"9a8f93c5-d267-47b5-a685-fc5bd8269d88","Type":"ContainerStarted","Data":"addbaec0fa85e7a26dfa8bdb020a4898612592d698e51af93557201c195afe0c"}
Apr 16 20:13:02.789345 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:02.789303 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tjktw" podStartSLOduration=65.044796582 podStartE2EDuration="1m9.789290788s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:12:57.651803392 +0000 UTC m=+74.826872552" lastFinishedPulling="2026-04-16 20:13:02.396297609 +0000 UTC m=+79.571366758" observedRunningTime="2026-04-16 20:13:02.787413265 +0000 UTC m=+79.962482433" watchObservedRunningTime="2026-04-16 20:13:02.789290788 +0000 UTC m=+79.964359955"
Apr 16 20:13:10.025535 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:10.025482 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7"
Apr 16 20:13:10.025535 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:10.025545 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7"
Apr 16 20:13:25.744039 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:25.744008 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nhjrd"
Apr 16 20:13:26.844433 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:26.844377 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7bfc9bb6c9-d9tjd" podUID="7656bf12-277a-4f69-8c48-491875a7c565" containerName="console" containerID="cri-o://74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391" gracePeriod=15
Apr 16 20:13:27.082162 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.082142 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bfc9bb6c9-d9tjd_7656bf12-277a-4f69-8c48-491875a7c565/console/0.log"
Apr 16 20:13:27.082273 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.082212 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bfc9bb6c9-d9tjd"
Apr 16 20:13:27.124342 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124270 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-oauth-config\") pod \"7656bf12-277a-4f69-8c48-491875a7c565\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") "
Apr 16 20:13:27.124342 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124306 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd948\" (UniqueName: \"kubernetes.io/projected/7656bf12-277a-4f69-8c48-491875a7c565-kube-api-access-hd948\") pod \"7656bf12-277a-4f69-8c48-491875a7c565\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") "
Apr 16 20:13:27.124342 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124341 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-service-ca\") pod \"7656bf12-277a-4f69-8c48-491875a7c565\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") "
Apr 16 20:13:27.124589 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124356 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-console-config\") pod \"7656bf12-277a-4f69-8c48-491875a7c565\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") "
Apr 16 20:13:27.124589 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124373 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-serving-cert\") pod \"7656bf12-277a-4f69-8c48-491875a7c565\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") "
Apr 16 20:13:27.124589 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124392 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-oauth-serving-cert\") pod \"7656bf12-277a-4f69-8c48-491875a7c565\" (UID: \"7656bf12-277a-4f69-8c48-491875a7c565\") "
Apr 16 20:13:27.124765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124737 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-service-ca" (OuterVolumeSpecName: "service-ca") pod "7656bf12-277a-4f69-8c48-491875a7c565" (UID: "7656bf12-277a-4f69-8c48-491875a7c565"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:13:27.124825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124777 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-console-config" (OuterVolumeSpecName: "console-config") pod "7656bf12-277a-4f69-8c48-491875a7c565" (UID: "7656bf12-277a-4f69-8c48-491875a7c565"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:13:27.124870 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.124845 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7656bf12-277a-4f69-8c48-491875a7c565" (UID: "7656bf12-277a-4f69-8c48-491875a7c565"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:13:27.126735 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.126702 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7656bf12-277a-4f69-8c48-491875a7c565" (UID: "7656bf12-277a-4f69-8c48-491875a7c565"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:13:27.126735 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.126724 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7656bf12-277a-4f69-8c48-491875a7c565-kube-api-access-hd948" (OuterVolumeSpecName: "kube-api-access-hd948") pod "7656bf12-277a-4f69-8c48-491875a7c565" (UID: "7656bf12-277a-4f69-8c48-491875a7c565"). InnerVolumeSpecName "kube-api-access-hd948". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:13:27.126848 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.126760 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7656bf12-277a-4f69-8c48-491875a7c565" (UID: "7656bf12-277a-4f69-8c48-491875a7c565"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:27.225350 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.225313 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-oauth-config\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:13:27.225350 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.225346 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hd948\" (UniqueName: \"kubernetes.io/projected/7656bf12-277a-4f69-8c48-491875a7c565-kube-api-access-hd948\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:13:27.225350 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.225357 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-service-ca\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:13:27.225623 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.225366 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-console-config\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:13:27.225623 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.225375 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7656bf12-277a-4f69-8c48-491875a7c565-console-serving-cert\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:13:27.225623 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.225383 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7656bf12-277a-4f69-8c48-491875a7c565-oauth-serving-cert\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:13:27.850069 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:13:27.850040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bfc9bb6c9-d9tjd_7656bf12-277a-4f69-8c48-491875a7c565/console/0.log" Apr 16 20:13:27.850511 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.850079 2574 generic.go:358] "Generic (PLEG): container finished" podID="7656bf12-277a-4f69-8c48-491875a7c565" containerID="74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391" exitCode=2 Apr 16 20:13:27.850511 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.850148 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bfc9bb6c9-d9tjd" event={"ID":"7656bf12-277a-4f69-8c48-491875a7c565","Type":"ContainerDied","Data":"74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391"} Apr 16 20:13:27.850511 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.850181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bfc9bb6c9-d9tjd" event={"ID":"7656bf12-277a-4f69-8c48-491875a7c565","Type":"ContainerDied","Data":"3b4b290d1e8a096634710b277798c735abbed16a77b904b1c2e3ac24db0297f2"} Apr 16 20:13:27.850511 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.850182 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bfc9bb6c9-d9tjd" Apr 16 20:13:27.850511 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.850196 2574 scope.go:117] "RemoveContainer" containerID="74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391" Apr 16 20:13:27.857993 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.857976 2574 scope.go:117] "RemoveContainer" containerID="74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391" Apr 16 20:13:27.858289 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:13:27.858265 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391\": container with ID starting with 74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391 not found: ID does not exist" containerID="74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391" Apr 16 20:13:27.858375 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.858296 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391"} err="failed to get container status \"74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391\": rpc error: code = NotFound desc = could not find container \"74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391\": container with ID starting with 74e01306bc4e6799a544e0b866913d1eba0aa5ebd8b96393b844a221bbd21391 not found: ID does not exist" Apr 16 20:13:27.870458 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.870423 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bfc9bb6c9-d9tjd"] Apr 16 20:13:27.873355 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:27.873333 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bfc9bb6c9-d9tjd"] Apr 16 20:13:29.396989 ip-10-0-134-158 kubenswrapper[2574]: 
I0416 20:13:29.396956 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7656bf12-277a-4f69-8c48-491875a7c565" path="/var/lib/kubelet/pods/7656bf12-277a-4f69-8c48-491875a7c565/volumes" Apr 16 20:13:30.030631 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:30.030605 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:13:30.034699 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:30.034674 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-766f745c7-bkbk7" Apr 16 20:13:52.035450 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:52.035397 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:13:52.054477 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:52.054438 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:13:52.936778 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:13:52.936752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:10.184486 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.184448 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:14:10.185066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.184929 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="prometheus" containerID="cri-o://30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23" gracePeriod=600 Apr 16 20:14:10.185066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.184955 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy" containerID="cri-o://e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff" gracePeriod=600 Apr 16 20:14:10.185066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.184987 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="thanos-sidecar" containerID="cri-o://e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3" gracePeriod=600 Apr 16 20:14:10.185066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.185014 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-web" containerID="cri-o://bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875" gracePeriod=600 Apr 16 20:14:10.185066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.184975 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e" gracePeriod=600 Apr 16 20:14:10.185353 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.185069 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="config-reloader" containerID="cri-o://d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12" gracePeriod=600 Apr 16 20:14:10.973687 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973656 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e" exitCode=0 Apr 
16 20:14:10.973687 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973681 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff" exitCode=0 Apr 16 20:14:10.973687 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973690 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3" exitCode=0 Apr 16 20:14:10.973687 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973696 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12" exitCode=0 Apr 16 20:14:10.973951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973703 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23" exitCode=0 Apr 16 20:14:10.973951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973731 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e"} Apr 16 20:14:10.973951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff"} Apr 16 20:14:10.973951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973768 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3"} Apr 16 20:14:10.973951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12"} Apr 16 20:14:10.973951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:10.973785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23"} Apr 16 20:14:11.423295 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.423276 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:11.491916 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.491888 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-serving-certs-ca-bundle\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492080 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.491926 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-metrics-client-ca\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492080 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.491963 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config-out\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492080 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492072 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492101 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-trusted-ca-bundle\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492131 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-kube-rbac-proxy\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492156 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-tls-assets\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492182 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-rulefiles-0\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492206 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tslwf\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-kube-api-access-tslwf\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492234 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492273 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-metrics-client-certs\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492327 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-thanos-prometheus-http-client-file\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492352 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-kubelet-serving-ca-bundle\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492376 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-grpc-tls\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492403 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:11.492566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.492411 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:11.493361 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.493329 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:11.493988 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.493963 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:11.495796 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.495765 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config" (OuterVolumeSpecName: "config") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:11.496965 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.496927 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497124 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-web-config\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497187 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-tls\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497253 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-db\") pod \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\" (UID: \"2440bb23-1627-4bc0-8d2b-50cbf4c470e4\") " Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497368 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497641 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497663 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-metrics-client-ca\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497680 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497695 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-tls-assets\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497717 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497730 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config\") on node \"ip-10-0-134-158.ec2.internal\" 
DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.497744 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:11.498541 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.498342 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:11.499378 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.496854 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config-out" (OuterVolumeSpecName: "config-out") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:11.499728 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.499637 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.503024 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.500939 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:11.503024 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.500975 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.503597 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.503466 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-kube-api-access-tslwf" (OuterVolumeSpecName: "kube-api-access-tslwf") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "kube-api-access-tslwf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:14:11.504279 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.504248 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.504670 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.504515 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.504782 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.504761 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.504974 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.504944 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.515195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.515172 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-web-config" (OuterVolumeSpecName: "web-config") pod "2440bb23-1627-4bc0-8d2b-50cbf4c470e4" (UID: "2440bb23-1627-4bc0-8d2b-50cbf4c470e4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:11.598999 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.598964 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-config-out\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.598999 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.598997 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599013 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-kube-rbac-proxy\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599027 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tslwf\" (UniqueName: \"kubernetes.io/projected/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-kube-api-access-tslwf\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599039 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599051 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-metrics-client-certs\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599064 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599077 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-grpc-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599088 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-web-config\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599099 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.599237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.599129 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2440bb23-1627-4bc0-8d2b-50cbf4c470e4-prometheus-k8s-db\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:14:11.978993 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.978891 2574 generic.go:358] "Generic (PLEG): container finished" podID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerID="bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875" exitCode=0
Apr 16 20:14:11.978993 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.978946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875"}
Apr 16 20:14:11.978993 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.978979 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2440bb23-1627-4bc0-8d2b-50cbf4c470e4","Type":"ContainerDied","Data":"d2bade7a677533964263a9264d4bc75920c9e1f991fbc7277b2b8e35c940bfff"}
Apr 16 20:14:11.979268 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.979004 2574 scope.go:117] "RemoveContainer" containerID="9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e"
Apr 16 20:14:11.979268 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.979037 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:11.986639 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.986619 2574 scope.go:117] "RemoveContainer" containerID="e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff"
Apr 16 20:14:11.992949 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.992932 2574 scope.go:117] "RemoveContainer" containerID="bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875"
Apr 16 20:14:11.999094 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:11.999076 2574 scope.go:117] "RemoveContainer" containerID="e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3"
Apr 16 20:14:12.005158 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.005090 2574 scope.go:117] "RemoveContainer" containerID="d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12"
Apr 16 20:14:12.007470 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.007445 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:14:12.011695 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.011679 2574 scope.go:117] "RemoveContainer" containerID="30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23"
Apr 16 20:14:12.015755 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.015735 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:14:12.018429 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.018412 2574 scope.go:117] "RemoveContainer" containerID="323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c"
Apr 16 20:14:12.024091 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024074 2574 scope.go:117] "RemoveContainer" containerID="9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e"
Apr 16 20:14:12.024333 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.024314 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e\": container with ID starting with 9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e not found: ID does not exist" containerID="9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e"
Apr 16 20:14:12.024381 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024347 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e"} err="failed to get container status \"9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e\": rpc error: code = NotFound desc = could not find container \"9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e\": container with ID starting with 9f3c86bfcccb24b1c2d30ead287498339044f131da8c21c1c838f421bb223b4e not found: ID does not exist"
Apr 16 20:14:12.024381 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024366 2574 scope.go:117] "RemoveContainer" containerID="e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff"
Apr 16 20:14:12.024581 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.024568 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff\": container with ID starting with e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff not found: ID does not exist" containerID="e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff"
Apr 16 20:14:12.024667 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024584 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff"} err="failed to get container status \"e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff\": rpc error: code = NotFound desc = could not find container \"e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff\": container with ID starting with e4dfdfe9c9f566a7fd0ba97c64c37dd51ed2c2fa2d5daf4219275c788834d7ff not found: ID does not exist"
Apr 16 20:14:12.024667 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024595 2574 scope.go:117] "RemoveContainer" containerID="bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875"
Apr 16 20:14:12.024814 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.024799 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875\": container with ID starting with bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875 not found: ID does not exist" containerID="bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875"
Apr 16 20:14:12.024854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024816 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875"} err="failed to get container status \"bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875\": rpc error: code = NotFound desc = could not find container \"bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875\": container with ID starting with bb5b5742368ae13fd6863b14ef772a42ddd9deb0cb20e8324dea26bbcc461875 not found: ID does not exist"
Apr 16 20:14:12.024854 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.024829 2574 scope.go:117] "RemoveContainer" containerID="e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3"
Apr 16 20:14:12.025038 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.025022 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3\": container with ID starting with e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3 not found: ID does not exist" containerID="e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3"
Apr 16 20:14:12.025073 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025043 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3"} err="failed to get container status \"e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3\": rpc error: code = NotFound desc = could not find container \"e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3\": container with ID starting with e5612230f85aaf8c538da28504aaa1c2931e5d52a5cf2b1e93b6fb4b6cb0b0b3 not found: ID does not exist"
Apr 16 20:14:12.025073 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025063 2574 scope.go:117] "RemoveContainer" containerID="d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12"
Apr 16 20:14:12.025285 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.025266 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12\": container with ID starting with d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12 not found: ID does not exist" containerID="d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12"
Apr 16 20:14:12.025330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025289 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12"} err="failed to get container status \"d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12\": rpc error: code = NotFound desc = could not find container \"d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12\": container with ID starting with d761abbecb386c950fa7b08ff92cd7f1c42ce90a326186b90d8ff7c9d761db12 not found: ID does not exist"
Apr 16 20:14:12.025330 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025303 2574 scope.go:117] "RemoveContainer" containerID="30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23"
Apr 16 20:14:12.025527 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.025511 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23\": container with ID starting with 30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23 not found: ID does not exist" containerID="30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23"
Apr 16 20:14:12.025563 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025531 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23"} err="failed to get container status \"30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23\": rpc error: code = NotFound desc = could not find container \"30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23\": container with ID starting with 30136029ec9e09ef02774daf1d2bc1aac32c123b6112186cf19dc91755640d23 not found: ID does not exist"
Apr 16 20:14:12.025563 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025546 2574 scope.go:117] "RemoveContainer" containerID="323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c"
Apr 16 20:14:12.025749 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:12.025733 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c\": container with ID starting with 323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c not found: ID does not exist" containerID="323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c"
Apr 16 20:14:12.025792 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.025753 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c"} err="failed to get container status \"323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c\": rpc error: code = NotFound desc = could not find container \"323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c\": container with ID starting with 323229a9637d3a364626d2ebb37e915945dad19ebb9de3d7efbd65a23c01307c not found: ID does not exist"
Apr 16 20:14:12.070165 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070138 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:14:12.070433 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070419 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070435 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070449 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="thanos-sidecar"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070458 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="thanos-sidecar"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070475 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="init-config-reloader"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070481 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="init-config-reloader"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070488 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="config-reloader"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070493 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="config-reloader"
Apr 16 20:14:12.070501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070500 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="prometheus"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070506 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="prometheus"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070515 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-thanos"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070520 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-thanos"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070538 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7656bf12-277a-4f69-8c48-491875a7c565" containerName="console"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070546 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7656bf12-277a-4f69-8c48-491875a7c565" containerName="console"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070553 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-web"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070559 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-web"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070612 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-web"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070623 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070629 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="kube-rbac-proxy-thanos"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070635 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="thanos-sidecar"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070643 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7656bf12-277a-4f69-8c48-491875a7c565" containerName="console"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070649 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="prometheus"
Apr 16 20:14:12.070732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.070656 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" containerName="config-reloader"
Apr 16 20:14:12.075699 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.075682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.078841 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.078823 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 20:14:12.079371 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079357 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 20:14:12.079371 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079365 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 20:14:12.079510 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079365 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4brmovh2s1os\""
Apr 16 20:14:12.079510 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079415 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 20:14:12.079510 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079421 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 20:14:12.079819 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079801 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 20:14:12.079885 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.079843 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 20:14:12.080509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.080312 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 20:14:12.080509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.080328 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 20:14:12.080509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.080328 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-zdsnm\""
Apr 16 20:14:12.080509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.080345 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 20:14:12.082401 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.082380 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 20:14:12.085725 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.085701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 20:14:12.088636 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.088613 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:14:12.103870 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.103847 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.103959 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.103892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.103959 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.103917 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104032 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.103995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104032 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104096 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104096 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104096 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-web-config\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104224 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104224 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104224 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfd231ea-71e7-4ccd-a622-53f9c9762097-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104224 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104224 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r9tk\" (UniqueName: \"kubernetes.io/projected/cfd231ea-71e7-4ccd-a622-53f9c9762097-kube-api-access-2r9tk\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-config\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfd231ea-71e7-4ccd-a622-53f9c9762097-config-out\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.104389 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.104345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205448 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfd231ea-71e7-4ccd-a622-53f9c9762097-config-out\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.205646 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205633 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-web-config\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID:
\"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfd231ea-71e7-4ccd-a622-53f9c9762097-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r9tk\" (UniqueName: \"kubernetes.io/projected/cfd231ea-71e7-4ccd-a622-53f9c9762097-kube-api-access-2r9tk\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-config\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.205960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206587 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.206345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.206673 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.206652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.207195 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.207168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.208973 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.208945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.209183 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.209161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfd231ea-71e7-4ccd-a622-53f9c9762097-config-out\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.209271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.209196 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.209606 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.209570 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.209836 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.209816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfd231ea-71e7-4ccd-a622-53f9c9762097-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.209942 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.209836 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.209942 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.209903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfd231ea-71e7-4ccd-a622-53f9c9762097-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.210171 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.210148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-web-config\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.210562 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:14:12.210541 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.210783 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.210763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.210783 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.210778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.211606 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.211592 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-config\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.211734 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.211719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfd231ea-71e7-4ccd-a622-53f9c9762097-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.216524 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:14:12.216502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r9tk\" (UniqueName: \"kubernetes.io/projected/cfd231ea-71e7-4ccd-a622-53f9c9762097-kube-api-access-2r9tk\") pod \"prometheus-k8s-0\" (UID: \"cfd231ea-71e7-4ccd-a622-53f9c9762097\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.385962 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.385922 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:12.543386 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.543359 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:14:12.543989 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:14:12.543966 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd231ea_71e7_4ccd_a622_53f9c9762097.slice/crio-d937748e31bfc5d5d8084d602401248ad67170f0cbd3fd439ddf2c6a0b3c1051 WatchSource:0}: Error finding container d937748e31bfc5d5d8084d602401248ad67170f0cbd3fd439ddf2c6a0b3c1051: Status 404 returned error can't find the container with id d937748e31bfc5d5d8084d602401248ad67170f0cbd3fd439ddf2c6a0b3c1051 Apr 16 20:14:12.983494 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.983469 2574 generic.go:358] "Generic (PLEG): container finished" podID="cfd231ea-71e7-4ccd-a622-53f9c9762097" containerID="ee15906cb38a0e51515cdc03a7ae280d7d156c35e79c9294e1f84e23e5092fbd" exitCode=0 Apr 16 20:14:12.983649 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.983503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerDied","Data":"ee15906cb38a0e51515cdc03a7ae280d7d156c35e79c9294e1f84e23e5092fbd"} Apr 16 20:14:12.983649 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:12.983522 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"d937748e31bfc5d5d8084d602401248ad67170f0cbd3fd439ddf2c6a0b3c1051"} Apr 16 20:14:13.397380 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.397349 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2440bb23-1627-4bc0-8d2b-50cbf4c470e4" path="/var/lib/kubelet/pods/2440bb23-1627-4bc0-8d2b-50cbf4c470e4/volumes" Apr 16 20:14:13.990878 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.990840 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"a7e3b63d53df03d6ec4f24a10fe9ba0af5d6d8c03f6a543a62938117cd1c9fa9"} Apr 16 20:14:13.990878 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.990879 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"e689358efcc1688010836c218108c2949a988ca543a9d32164306c9c21851592"} Apr 16 20:14:13.991353 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.990893 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"74a92bc491ab2b4893487a976b4004c0ff2b4aa52b3c674c05c268cc6b2644fa"} Apr 16 20:14:13.991353 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.990905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"721d8a123e065680084a7b2d3329c6af4eb9b034edc5fcdb921e86e789db1e77"} Apr 16 20:14:13.991353 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.990915 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"299802fc7da975f70e8275090d35e57f95adbc128d515308b56d36e91be906ea"} Apr 16 20:14:13.991353 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:13.990925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cfd231ea-71e7-4ccd-a622-53f9c9762097","Type":"ContainerStarted","Data":"90916048c3650965884acf5bc5d8a39a68ac67fdc019b42e5a2027994a973b09"} Apr 16 20:14:14.020961 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:14.020895 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.020878103 podStartE2EDuration="2.020878103s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:14.019189937 +0000 UTC m=+151.194259106" watchObservedRunningTime="2026-04-16 20:14:14.020878103 +0000 UTC m=+151.195947272" Apr 16 20:14:17.386758 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:17.386724 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:14:25.564856 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:25.564823 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64dc664459-q7jc9"] Apr 16 20:14:50.583551 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.583490 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64dc664459-q7jc9" podUID="dc6a280e-594f-4385-8810-45fc682a3aff" containerName="console" containerID="cri-o://a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de" gracePeriod=15 Apr 16 20:14:50.823582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.823559 2574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console_console-64dc664459-q7jc9_dc6a280e-594f-4385-8810-45fc682a3aff/console/0.log" Apr 16 20:14:50.823706 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.823620 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:14:50.929168 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929067 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-trusted-ca-bundle\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929168 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929119 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl2mr\" (UniqueName: \"kubernetes.io/projected/dc6a280e-594f-4385-8810-45fc682a3aff-kube-api-access-bl2mr\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929405 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929247 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-service-ca\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929405 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929295 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-serving-cert\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929405 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929324 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-oauth-serving-cert\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929563 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929438 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-console-config\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929563 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929504 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-oauth-config\") pod \"dc6a280e-594f-4385-8810-45fc682a3aff\" (UID: \"dc6a280e-594f-4385-8810-45fc682a3aff\") " Apr 16 20:14:50.929670 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929568 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:50.929670 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929621 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-service-ca" (OuterVolumeSpecName: "service-ca") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:50.929762 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929637 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:50.929846 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929817 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-console-config" (OuterVolumeSpecName: "console-config") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:50.929967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929875 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-service-ca\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:50.929967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929891 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-oauth-serving-cert\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:50.929967 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.929903 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-trusted-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:50.931603 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:14:50.931582 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:50.931839 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.931819 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6a280e-594f-4385-8810-45fc682a3aff-kube-api-access-bl2mr" (OuterVolumeSpecName: "kube-api-access-bl2mr") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "kube-api-access-bl2mr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:50.931884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:50.931823 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dc6a280e-594f-4385-8810-45fc682a3aff" (UID: "dc6a280e-594f-4385-8810-45fc682a3aff"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:51.030423 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.030383 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-serving-cert\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:51.030423 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.030415 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc6a280e-594f-4385-8810-45fc682a3aff-console-config\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:51.030423 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.030425 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc6a280e-594f-4385-8810-45fc682a3aff-console-oauth-config\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:51.030640 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.030435 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bl2mr\" (UniqueName: \"kubernetes.io/projected/dc6a280e-594f-4385-8810-45fc682a3aff-kube-api-access-bl2mr\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:14:51.094196 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.094167 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64dc664459-q7jc9_dc6a280e-594f-4385-8810-45fc682a3aff/console/0.log" Apr 16 20:14:51.094336 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.094206 2574 generic.go:358] "Generic (PLEG): container finished" podID="dc6a280e-594f-4385-8810-45fc682a3aff" containerID="a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de" exitCode=2 Apr 16 20:14:51.094336 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.094238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-64dc664459-q7jc9" event={"ID":"dc6a280e-594f-4385-8810-45fc682a3aff","Type":"ContainerDied","Data":"a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de"} Apr 16 20:14:51.094336 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.094278 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64dc664459-q7jc9" Apr 16 20:14:51.094336 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.094291 2574 scope.go:117] "RemoveContainer" containerID="a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de" Apr 16 20:14:51.094520 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.094280 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64dc664459-q7jc9" event={"ID":"dc6a280e-594f-4385-8810-45fc682a3aff","Type":"ContainerDied","Data":"8879593f1e5fbbc610d7a5ea9d4f2d2e0e5cdbc3a65b2caea1d2e4f1f0872f05"} Apr 16 20:14:51.102063 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.102046 2574 scope.go:117] "RemoveContainer" containerID="a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de" Apr 16 20:14:51.102338 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:14:51.102317 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de\": container with ID starting with a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de not found: ID does not exist" containerID="a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de" Apr 16 20:14:51.102409 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.102351 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de"} err="failed to get container status \"a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de\": rpc error: code = 
NotFound desc = could not find container \"a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de\": container with ID starting with a30529f46bd5b626e610fe52a4c33c8d5432b62129bdb62c10ac5f45de1896de not found: ID does not exist" Apr 16 20:14:51.116884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.116852 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64dc664459-q7jc9"] Apr 16 20:14:51.124855 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.124824 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64dc664459-q7jc9"] Apr 16 20:14:51.396267 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:14:51.396234 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc6a280e-594f-4385-8810-45fc682a3aff" path="/var/lib/kubelet/pods/dc6a280e-594f-4385-8810-45fc682a3aff/volumes" Apr 16 20:15:12.386986 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:15:12.386951 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:15:12.401907 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:15:12.401868 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:15:13.168928 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:15:13.168898 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:17:42.298163 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.298127 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-7kgxr"] Apr 16 20:17:42.298598 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.298413 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc6a280e-594f-4385-8810-45fc682a3aff" containerName="console" Apr 16 20:17:42.298598 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.298425 2574 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dc6a280e-594f-4385-8810-45fc682a3aff" containerName="console" Apr 16 20:17:42.298598 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.298474 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc6a280e-594f-4385-8810-45fc682a3aff" containerName="console" Apr 16 20:17:42.301260 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.301239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.303815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.303787 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 20:17:42.303815 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.303788 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:17:42.303984 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.303833 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-w6crc\"" Apr 16 20:17:42.304770 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.304755 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:17:42.310445 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.310425 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7kgxr"] Apr 16 20:17:42.402243 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.402213 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbvv\" (UniqueName: \"kubernetes.io/projected/a2e072ed-f2ae-4d03-b630-d108a0c477c8-kube-api-access-mpbvv\") pod \"seaweedfs-86cc847c5c-7kgxr\" (UID: \"a2e072ed-f2ae-4d03-b630-d108a0c477c8\") " pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.402403 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:17:42.402256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a2e072ed-f2ae-4d03-b630-d108a0c477c8-data\") pod \"seaweedfs-86cc847c5c-7kgxr\" (UID: \"a2e072ed-f2ae-4d03-b630-d108a0c477c8\") " pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.502822 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.502792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbvv\" (UniqueName: \"kubernetes.io/projected/a2e072ed-f2ae-4d03-b630-d108a0c477c8-kube-api-access-mpbvv\") pod \"seaweedfs-86cc847c5c-7kgxr\" (UID: \"a2e072ed-f2ae-4d03-b630-d108a0c477c8\") " pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.502990 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.502827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a2e072ed-f2ae-4d03-b630-d108a0c477c8-data\") pod \"seaweedfs-86cc847c5c-7kgxr\" (UID: \"a2e072ed-f2ae-4d03-b630-d108a0c477c8\") " pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.503242 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.503222 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a2e072ed-f2ae-4d03-b630-d108a0c477c8-data\") pod \"seaweedfs-86cc847c5c-7kgxr\" (UID: \"a2e072ed-f2ae-4d03-b630-d108a0c477c8\") " pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.511196 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.511174 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbvv\" (UniqueName: \"kubernetes.io/projected/a2e072ed-f2ae-4d03-b630-d108a0c477c8-kube-api-access-mpbvv\") pod \"seaweedfs-86cc847c5c-7kgxr\" (UID: \"a2e072ed-f2ae-4d03-b630-d108a0c477c8\") " pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.611391 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.611305 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:42.729450 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.729258 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7kgxr"] Apr 16 20:17:42.732493 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:17:42.732470 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e072ed_f2ae_4d03_b630_d108a0c477c8.slice/crio-96b9523f7b87b6b245ad932f2b1ff9bddf961ed48710178b0b35dfc19cc0fdea WatchSource:0}: Error finding container 96b9523f7b87b6b245ad932f2b1ff9bddf961ed48710178b0b35dfc19cc0fdea: Status 404 returned error can't find the container with id 96b9523f7b87b6b245ad932f2b1ff9bddf961ed48710178b0b35dfc19cc0fdea Apr 16 20:17:42.734455 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:42.734438 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:17:43.555501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:43.555458 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7kgxr" event={"ID":"a2e072ed-f2ae-4d03-b630-d108a0c477c8","Type":"ContainerStarted","Data":"96b9523f7b87b6b245ad932f2b1ff9bddf961ed48710178b0b35dfc19cc0fdea"} Apr 16 20:17:45.620377 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:45.620347 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 20:17:46.564398 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:46.564362 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7kgxr" event={"ID":"a2e072ed-f2ae-4d03-b630-d108a0c477c8","Type":"ContainerStarted","Data":"7afb540d9d6f0dde469266474bdd7e425c409edff163cf612c9134baed30e97f"} Apr 16 20:17:46.564569 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:46.564482 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:17:46.581120 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:46.581065 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-7kgxr" podStartSLOduration=1.697660162 podStartE2EDuration="4.581046484s" podCreationTimestamp="2026-04-16 20:17:42 +0000 UTC" firstStartedPulling="2026-04-16 20:17:42.734561734 +0000 UTC m=+359.909630881" lastFinishedPulling="2026-04-16 20:17:45.617948057 +0000 UTC m=+362.793017203" observedRunningTime="2026-04-16 20:17:46.579630572 +0000 UTC m=+363.754699736" watchObservedRunningTime="2026-04-16 20:17:46.581046484 +0000 UTC m=+363.756115652" Apr 16 20:17:52.574289 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:17:52.574258 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-7kgxr" Apr 16 20:18:53.035536 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.035499 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-dkl42"] Apr 16 20:18:53.038822 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.038801 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.041895 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.041873 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:18:53.041895 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.041882 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vk2rj\"" Apr 16 20:18:53.048951 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.048929 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dkl42"] Apr 16 20:18:53.051014 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.050990 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gr82t"] Apr 16 20:18:53.054088 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.054073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.056941 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.056920 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-dsf9c\"" Apr 16 20:18:53.057032 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.056962 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:18:53.063562 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.063537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gr82t"] Apr 16 20:18:53.149616 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.149578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba76c1a-d074-4aef-9798-9cd3ce4a7826-tls-certs\") pod \"model-serving-api-86f7b4b499-dkl42\" 
(UID: \"3ba76c1a-d074-4aef-9798-9cd3ce4a7826\") " pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.149805 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.149648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6l2m\" (UniqueName: \"kubernetes.io/projected/3ba76c1a-d074-4aef-9798-9cd3ce4a7826-kube-api-access-k6l2m\") pod \"model-serving-api-86f7b4b499-dkl42\" (UID: \"3ba76c1a-d074-4aef-9798-9cd3ce4a7826\") " pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.250571 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.250529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba76c1a-d074-4aef-9798-9cd3ce4a7826-tls-certs\") pod \"model-serving-api-86f7b4b499-dkl42\" (UID: \"3ba76c1a-d074-4aef-9798-9cd3ce4a7826\") " pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.250772 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.250580 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cdaa442-7025-41c6-989e-5d1f82822423-cert\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.250772 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.250630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7fzh\" (UniqueName: \"kubernetes.io/projected/5cdaa442-7025-41c6-989e-5d1f82822423-kube-api-access-r7fzh\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.250772 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.250670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k6l2m\" (UniqueName: \"kubernetes.io/projected/3ba76c1a-d074-4aef-9798-9cd3ce4a7826-kube-api-access-k6l2m\") pod \"model-serving-api-86f7b4b499-dkl42\" (UID: \"3ba76c1a-d074-4aef-9798-9cd3ce4a7826\") " pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.253133 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.253087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba76c1a-d074-4aef-9798-9cd3ce4a7826-tls-certs\") pod \"model-serving-api-86f7b4b499-dkl42\" (UID: \"3ba76c1a-d074-4aef-9798-9cd3ce4a7826\") " pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.261947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.261922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6l2m\" (UniqueName: \"kubernetes.io/projected/3ba76c1a-d074-4aef-9798-9cd3ce4a7826-kube-api-access-k6l2m\") pod \"model-serving-api-86f7b4b499-dkl42\" (UID: \"3ba76c1a-d074-4aef-9798-9cd3ce4a7826\") " pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.349982 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.349897 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:53.351813 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.351791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7fzh\" (UniqueName: \"kubernetes.io/projected/5cdaa442-7025-41c6-989e-5d1f82822423-kube-api-access-r7fzh\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.351876 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.351863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cdaa442-7025-41c6-989e-5d1f82822423-cert\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.351993 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:18:53.351978 2574 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 20:18:53.352052 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:18:53.352043 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cdaa442-7025-41c6-989e-5d1f82822423-cert podName:5cdaa442-7025-41c6-989e-5d1f82822423 nodeName:}" failed. No retries permitted until 2026-04-16 20:18:53.852022303 +0000 UTC m=+431.027091454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5cdaa442-7025-41c6-989e-5d1f82822423-cert") pod "odh-model-controller-696fc77849-gr82t" (UID: "5cdaa442-7025-41c6-989e-5d1f82822423") : secret "odh-model-controller-webhook-cert" not found Apr 16 20:18:53.359857 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.359827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7fzh\" (UniqueName: \"kubernetes.io/projected/5cdaa442-7025-41c6-989e-5d1f82822423-kube-api-access-r7fzh\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.469835 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.469808 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dkl42"] Apr 16 20:18:53.472528 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:18:53.472495 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ba76c1a_d074_4aef_9798_9cd3ce4a7826.slice/crio-a905ef6d1c91e2821b97a57f2dc3105e028bf5edc460a1f3c7859d69c20e3f49 WatchSource:0}: Error finding container a905ef6d1c91e2821b97a57f2dc3105e028bf5edc460a1f3c7859d69c20e3f49: Status 404 returned error can't find the container with id a905ef6d1c91e2821b97a57f2dc3105e028bf5edc460a1f3c7859d69c20e3f49 Apr 16 20:18:53.752672 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.752640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dkl42" event={"ID":"3ba76c1a-d074-4aef-9798-9cd3ce4a7826","Type":"ContainerStarted","Data":"a905ef6d1c91e2821b97a57f2dc3105e028bf5edc460a1f3c7859d69c20e3f49"} Apr 16 20:18:53.856205 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.856168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5cdaa442-7025-41c6-989e-5d1f82822423-cert\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.858517 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.858500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cdaa442-7025-41c6-989e-5d1f82822423-cert\") pod \"odh-model-controller-696fc77849-gr82t\" (UID: \"5cdaa442-7025-41c6-989e-5d1f82822423\") " pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:53.966675 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:53.966637 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:54.107519 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:54.107356 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gr82t"] Apr 16 20:18:54.110845 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:18:54.110811 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cdaa442_7025_41c6_989e_5d1f82822423.slice/crio-7ec89fa06341aeef7af5d44bf7201d191fc5d7a6d807ac802a43d425a19dfce3 WatchSource:0}: Error finding container 7ec89fa06341aeef7af5d44bf7201d191fc5d7a6d807ac802a43d425a19dfce3: Status 404 returned error can't find the container with id 7ec89fa06341aeef7af5d44bf7201d191fc5d7a6d807ac802a43d425a19dfce3 Apr 16 20:18:54.758548 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:54.758485 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gr82t" event={"ID":"5cdaa442-7025-41c6-989e-5d1f82822423","Type":"ContainerStarted","Data":"7ec89fa06341aeef7af5d44bf7201d191fc5d7a6d807ac802a43d425a19dfce3"} Apr 16 20:18:57.768404 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:57.768358 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gr82t" event={"ID":"5cdaa442-7025-41c6-989e-5d1f82822423","Type":"ContainerStarted","Data":"2b7e025b454b39f32767786aced5a9b2f783f0e17ebb9f8f3c34cb19cb47ac08"} Apr 16 20:18:57.768870 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:57.768505 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:18:57.769765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:57.769744 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dkl42" event={"ID":"3ba76c1a-d074-4aef-9798-9cd3ce4a7826","Type":"ContainerStarted","Data":"83ae09f55472004ab466334808955a77597bd49ddcc63f6948011e5d6c7fe5e4"} Apr 16 20:18:57.769915 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:57.769899 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:18:57.786284 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:57.786230 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gr82t" podStartSLOduration=1.736526628 podStartE2EDuration="4.786216691s" podCreationTimestamp="2026-04-16 20:18:53 +0000 UTC" firstStartedPulling="2026-04-16 20:18:54.113221982 +0000 UTC m=+431.288291134" lastFinishedPulling="2026-04-16 20:18:57.162912048 +0000 UTC m=+434.337981197" observedRunningTime="2026-04-16 20:18:57.784923145 +0000 UTC m=+434.959992311" watchObservedRunningTime="2026-04-16 20:18:57.786216691 +0000 UTC m=+434.961285883" Apr 16 20:18:57.801318 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:18:57.801277 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-dkl42" podStartSLOduration=1.116622261 podStartE2EDuration="4.801265616s" podCreationTimestamp="2026-04-16 20:18:53 +0000 UTC" 
firstStartedPulling="2026-04-16 20:18:53.474236054 +0000 UTC m=+430.649305201" lastFinishedPulling="2026-04-16 20:18:57.15887941 +0000 UTC m=+434.333948556" observedRunningTime="2026-04-16 20:18:57.799447939 +0000 UTC m=+434.974517107" watchObservedRunningTime="2026-04-16 20:18:57.801265616 +0000 UTC m=+434.976334783" Apr 16 20:19:08.774923 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:08.774851 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gr82t" Apr 16 20:19:08.777036 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:08.777017 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-dkl42" Apr 16 20:19:09.616704 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.616671 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jvcft"] Apr 16 20:19:09.619697 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.619679 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jvcft" Apr 16 20:19:09.628171 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.628140 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jvcft"] Apr 16 20:19:09.688339 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.688303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znd68\" (UniqueName: \"kubernetes.io/projected/d4c6cc98-1c17-46e0-ae8e-b749d4c1775c-kube-api-access-znd68\") pod \"s3-init-jvcft\" (UID: \"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c\") " pod="kserve/s3-init-jvcft" Apr 16 20:19:09.788953 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.788907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znd68\" (UniqueName: \"kubernetes.io/projected/d4c6cc98-1c17-46e0-ae8e-b749d4c1775c-kube-api-access-znd68\") pod \"s3-init-jvcft\" (UID: \"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c\") " pod="kserve/s3-init-jvcft" Apr 16 20:19:09.796756 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.796727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znd68\" (UniqueName: \"kubernetes.io/projected/d4c6cc98-1c17-46e0-ae8e-b749d4c1775c-kube-api-access-znd68\") pod \"s3-init-jvcft\" (UID: \"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c\") " pod="kserve/s3-init-jvcft" Apr 16 20:19:09.941775 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:09.941676 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jvcft" Apr 16 20:19:10.062735 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:10.062697 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jvcft"] Apr 16 20:19:10.065699 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:19:10.065657 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4c6cc98_1c17_46e0_ae8e_b749d4c1775c.slice/crio-ba34cfb73d03c2f1ed10b424b5e4c082b63ed01af1795fbd1c02e6bef5d8642e WatchSource:0}: Error finding container ba34cfb73d03c2f1ed10b424b5e4c082b63ed01af1795fbd1c02e6bef5d8642e: Status 404 returned error can't find the container with id ba34cfb73d03c2f1ed10b424b5e4c082b63ed01af1795fbd1c02e6bef5d8642e Apr 16 20:19:10.812752 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:10.812712 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jvcft" event={"ID":"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c","Type":"ContainerStarted","Data":"ba34cfb73d03c2f1ed10b424b5e4c082b63ed01af1795fbd1c02e6bef5d8642e"} Apr 16 20:19:14.828748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:14.828653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jvcft" event={"ID":"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c","Type":"ContainerStarted","Data":"24d03fd9767a3b63e59c7b0cc6e6714ee377d9127c56fd8c52450beb9c004f5e"} Apr 16 20:19:14.846161 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:14.846081 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jvcft" podStartSLOduration=1.407983578 podStartE2EDuration="5.846064169s" podCreationTimestamp="2026-04-16 20:19:09 +0000 UTC" firstStartedPulling="2026-04-16 20:19:10.067585595 +0000 UTC m=+447.242654741" lastFinishedPulling="2026-04-16 20:19:14.505666172 +0000 UTC m=+451.680735332" observedRunningTime="2026-04-16 20:19:14.845948327 +0000 UTC m=+452.021017496" watchObservedRunningTime="2026-04-16 
20:19:14.846064169 +0000 UTC m=+452.021133337" Apr 16 20:19:17.838629 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:17.838596 2574 generic.go:358] "Generic (PLEG): container finished" podID="d4c6cc98-1c17-46e0-ae8e-b749d4c1775c" containerID="24d03fd9767a3b63e59c7b0cc6e6714ee377d9127c56fd8c52450beb9c004f5e" exitCode=0 Apr 16 20:19:17.839066 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:17.838673 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jvcft" event={"ID":"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c","Type":"ContainerDied","Data":"24d03fd9767a3b63e59c7b0cc6e6714ee377d9127c56fd8c52450beb9c004f5e"} Apr 16 20:19:18.961988 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:18.961965 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jvcft" Apr 16 20:19:19.065614 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:19.065578 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znd68\" (UniqueName: \"kubernetes.io/projected/d4c6cc98-1c17-46e0-ae8e-b749d4c1775c-kube-api-access-znd68\") pod \"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c\" (UID: \"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c\") " Apr 16 20:19:19.067701 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:19.067674 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c6cc98-1c17-46e0-ae8e-b749d4c1775c-kube-api-access-znd68" (OuterVolumeSpecName: "kube-api-access-znd68") pod "d4c6cc98-1c17-46e0-ae8e-b749d4c1775c" (UID: "d4c6cc98-1c17-46e0-ae8e-b749d4c1775c"). InnerVolumeSpecName "kube-api-access-znd68". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:19.166983 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:19.166908 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znd68\" (UniqueName: \"kubernetes.io/projected/d4c6cc98-1c17-46e0-ae8e-b749d4c1775c-kube-api-access-znd68\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:19:19.846037 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:19.845988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jvcft" event={"ID":"d4c6cc98-1c17-46e0-ae8e-b749d4c1775c","Type":"ContainerDied","Data":"ba34cfb73d03c2f1ed10b424b5e4c082b63ed01af1795fbd1c02e6bef5d8642e"} Apr 16 20:19:19.846037 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:19.846025 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jvcft" Apr 16 20:19:19.846037 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:19.846033 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba34cfb73d03c2f1ed10b424b5e4c082b63ed01af1795fbd1c02e6bef5d8642e" Apr 16 20:19:28.897433 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.897403 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"] Apr 16 20:19:28.897909 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.897728 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4c6cc98-1c17-46e0-ae8e-b749d4c1775c" containerName="s3-init" Apr 16 20:19:28.897909 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.897743 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c6cc98-1c17-46e0-ae8e-b749d4c1775c" containerName="s3-init" Apr 16 20:19:28.897909 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.897797 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4c6cc98-1c17-46e0-ae8e-b749d4c1775c" containerName="s3-init" Apr 16 20:19:28.903836 
ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.903814 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" Apr 16 20:19:28.906304 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.906273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5hj24\"" Apr 16 20:19:28.907832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:28.907805 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"] Apr 16 20:19:29.061447 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.061413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/432f4206-43c5-40ef-abd2-fd526131a888-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn\" (UID: \"432f4206-43c5-40ef-abd2-fd526131a888\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" Apr 16 20:19:29.162355 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.162255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/432f4206-43c5-40ef-abd2-fd526131a888-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn\" (UID: \"432f4206-43c5-40ef-abd2-fd526131a888\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" Apr 16 20:19:29.162651 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.162633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/432f4206-43c5-40ef-abd2-fd526131a888-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn\" (UID: \"432f4206-43c5-40ef-abd2-fd526131a888\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" Apr 16 20:19:29.214378 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.214342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" Apr 16 20:19:29.232573 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.232545 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"] Apr 16 20:19:29.237732 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.237708 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" Apr 16 20:19:29.247064 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.247039 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"] Apr 16 20:19:29.343521 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.343490 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"] Apr 16 20:19:29.346391 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:19:29.346364 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432f4206_43c5_40ef_abd2_fd526131a888.slice/crio-304c93107c0a92dced8e6da8b150ba9cbdd1ba1f7f64bd2dcd6f63853da76992 WatchSource:0}: Error finding container 304c93107c0a92dced8e6da8b150ba9cbdd1ba1f7f64bd2dcd6f63853da76992: Status 404 returned error can't find the container with id 304c93107c0a92dced8e6da8b150ba9cbdd1ba1f7f64bd2dcd6f63853da76992 Apr 16 20:19:29.364428 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.364406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a1c4a0e5-d213-4def-9f4e-188d05a7e679-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-qqb9q\" (UID: \"a1c4a0e5-d213-4def-9f4e-188d05a7e679\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" Apr 16 20:19:29.407037 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.407000 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"] Apr 16 20:19:29.410877 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.410842 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" Apr 16 20:19:29.418813 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.418790 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"] Apr 16 20:19:29.465745 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.465705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1c4a0e5-d213-4def-9f4e-188d05a7e679-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-qqb9q\" (UID: \"a1c4a0e5-d213-4def-9f4e-188d05a7e679\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" Apr 16 20:19:29.471835 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.471803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1c4a0e5-d213-4def-9f4e-188d05a7e679-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-qqb9q\" (UID: \"a1c4a0e5-d213-4def-9f4e-188d05a7e679\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" Apr 16 20:19:29.549738 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.549694 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" Apr 16 20:19:29.567417 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.567389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5395df40-d086-40a3-8f00-190870297e9f-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6\" (UID: \"5395df40-d086-40a3-8f00-190870297e9f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" Apr 16 20:19:29.668478 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.668407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5395df40-d086-40a3-8f00-190870297e9f-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6\" (UID: \"5395df40-d086-40a3-8f00-190870297e9f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" Apr 16 20:19:29.668781 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.668760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5395df40-d086-40a3-8f00-190870297e9f-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6\" (UID: \"5395df40-d086-40a3-8f00-190870297e9f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" Apr 16 20:19:29.682078 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.682053 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"] Apr 16 20:19:29.683803 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:19:29.683775 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c4a0e5_d213_4def_9f4e_188d05a7e679.slice/crio-5be9a5eeb356cfc54021069a609082a677f6a3e750f2a1d8b687a5ae714a58ac WatchSource:0}: Error finding container 5be9a5eeb356cfc54021069a609082a677f6a3e750f2a1d8b687a5ae714a58ac: Status 404 returned error can't find the container with id 5be9a5eeb356cfc54021069a609082a677f6a3e750f2a1d8b687a5ae714a58ac Apr 16 20:19:29.722103 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.722066 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" Apr 16 20:19:29.849573 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.849548 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"] Apr 16 20:19:29.851869 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:19:29.851839 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5395df40_d086_40a3_8f00_190870297e9f.slice/crio-01145f4f6d9befb9ad3952a7ccdfb148a15a974bdc3c623bb775641e6e648146 WatchSource:0}: Error finding container 01145f4f6d9befb9ad3952a7ccdfb148a15a974bdc3c623bb775641e6e648146: Status 404 returned error can't find the container with id 01145f4f6d9befb9ad3952a7ccdfb148a15a974bdc3c623bb775641e6e648146 Apr 16 20:19:29.874814 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.874776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" event={"ID":"5395df40-d086-40a3-8f00-190870297e9f","Type":"ContainerStarted","Data":"01145f4f6d9befb9ad3952a7ccdfb148a15a974bdc3c623bb775641e6e648146"} Apr 16 20:19:29.875864 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.875836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" 
event={"ID":"432f4206-43c5-40ef-abd2-fd526131a888","Type":"ContainerStarted","Data":"304c93107c0a92dced8e6da8b150ba9cbdd1ba1f7f64bd2dcd6f63853da76992"} Apr 16 20:19:29.876808 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:29.876783 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" event={"ID":"a1c4a0e5-d213-4def-9f4e-188d05a7e679","Type":"ContainerStarted","Data":"5be9a5eeb356cfc54021069a609082a677f6a3e750f2a1d8b687a5ae714a58ac"} Apr 16 20:19:34.899210 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:34.899166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" event={"ID":"5395df40-d086-40a3-8f00-190870297e9f","Type":"ContainerStarted","Data":"1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1"} Apr 16 20:19:34.901238 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:34.901207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" event={"ID":"432f4206-43c5-40ef-abd2-fd526131a888","Type":"ContainerStarted","Data":"9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d"} Apr 16 20:19:34.903567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:34.903536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" event={"ID":"a1c4a0e5-d213-4def-9f4e-188d05a7e679","Type":"ContainerStarted","Data":"26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83"} Apr 16 20:19:38.915150 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:38.915092 2574 generic.go:358] "Generic (PLEG): container finished" podID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerID="26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83" exitCode=0 Apr 16 20:19:38.915595 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:38.915168 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" event={"ID":"a1c4a0e5-d213-4def-9f4e-188d05a7e679","Type":"ContainerDied","Data":"26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83"} Apr 16 20:19:38.916480 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:38.916459 2574 generic.go:358] "Generic (PLEG): container finished" podID="5395df40-d086-40a3-8f00-190870297e9f" containerID="1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1" exitCode=0 Apr 16 20:19:38.916588 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:38.916527 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" event={"ID":"5395df40-d086-40a3-8f00-190870297e9f","Type":"ContainerDied","Data":"1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1"} Apr 16 20:19:38.917847 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:38.917796 2574 generic.go:358] "Generic (PLEG): container finished" podID="432f4206-43c5-40ef-abd2-fd526131a888" containerID="9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d" exitCode=0 Apr 16 20:19:38.917847 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:19:38.917837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" event={"ID":"432f4206-43c5-40ef-abd2-fd526131a888","Type":"ContainerDied","Data":"9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d"} Apr 16 20:20:05.030955 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.030912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" event={"ID":"5395df40-d086-40a3-8f00-190870297e9f","Type":"ContainerStarted","Data":"ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb"} Apr 16 20:20:05.031447 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.031227 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" Apr 16 20:20:05.032509 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.032486 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" event={"ID":"432f4206-43c5-40ef-abd2-fd526131a888","Type":"ContainerStarted","Data":"51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6"} Apr 16 20:20:05.032747 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.032704 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:05.032808 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.032747 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" Apr 16 20:20:05.033649 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.033628 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:05.048778 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.048721 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podStartSLOduration=1.123072058 podStartE2EDuration="36.048705586s" podCreationTimestamp="2026-04-16 20:19:29 +0000 UTC" firstStartedPulling="2026-04-16 20:19:29.853707351 +0000 UTC m=+467.028776497" lastFinishedPulling="2026-04-16 20:20:04.779340871 +0000 UTC m=+501.954410025" observedRunningTime="2026-04-16 20:20:05.046288137 
+0000 UTC m=+502.221357315" watchObservedRunningTime="2026-04-16 20:20:05.048705586 +0000 UTC m=+502.223774755" Apr 16 20:20:05.061474 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:05.061418 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podStartSLOduration=1.630266115 podStartE2EDuration="37.06140002s" podCreationTimestamp="2026-04-16 20:19:28 +0000 UTC" firstStartedPulling="2026-04-16 20:19:29.348207843 +0000 UTC m=+466.523276990" lastFinishedPulling="2026-04-16 20:20:04.779341742 +0000 UTC m=+501.954410895" observedRunningTime="2026-04-16 20:20:05.060777161 +0000 UTC m=+502.235846328" watchObservedRunningTime="2026-04-16 20:20:05.06140002 +0000 UTC m=+502.236469189" Apr 16 20:20:06.036987 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:06.036951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" event={"ID":"a1c4a0e5-d213-4def-9f4e-188d05a7e679","Type":"ContainerStarted","Data":"2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4"} Apr 16 20:20:06.037448 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:06.037349 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:06.037448 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:06.037348 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:06.037531 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:06.037515 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" Apr 16 20:20:06.038790 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:06.038764 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:06.053888 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:06.053841 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podStartSLOduration=1.143308153 podStartE2EDuration="37.053828368s" podCreationTimestamp="2026-04-16 20:19:29 +0000 UTC" firstStartedPulling="2026-04-16 20:19:29.685711622 +0000 UTC m=+466.860780783" lastFinishedPulling="2026-04-16 20:20:05.596231849 +0000 UTC m=+502.771300998" observedRunningTime="2026-04-16 20:20:06.052819759 +0000 UTC m=+503.227888927" watchObservedRunningTime="2026-04-16 20:20:06.053828368 +0000 UTC m=+503.228897536" Apr 16 20:20:07.040692 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:07.040651 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:16.037630 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:16.037583 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:16.038013 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:20:16.037583 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:17.041194 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:17.041150 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:26.037647 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:26.037599 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:26.038036 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:26.037605 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:27.041321 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:27.041280 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:36.038296 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:36.038213 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:36.038762 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:36.038213 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:37.040836 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:37.040795 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:46.038033 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:46.037991 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:46.038487 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:46.037999 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:47.040736 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:47.040693 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:48.891276 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.891249 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"] Apr 16 20:20:48.896256 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.896240 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:48.898283 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.898254 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:20:48.898511 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.898486 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-19633-serving-cert\"" Apr 16 20:20:48.898631 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.898607 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-19633-kube-rbac-proxy-sar-config\"" Apr 16 20:20:48.902721 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.902701 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"] Apr 16 20:20:48.966078 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.966045 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0afef93a-f293-4d32-a792-6189db0c28dd-openshift-service-ca-bundle\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:48.966230 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:48.966120 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.066882 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.066849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0afef93a-f293-4d32-a792-6189db0c28dd-openshift-service-ca-bundle\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.067016 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.066889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.067062 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:20:49.067019 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-19633-serving-cert: secret "switch-graph-19633-serving-cert" not found Apr 16 20:20:49.067120 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:20:49.067096 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls podName:0afef93a-f293-4d32-a792-6189db0c28dd nodeName:}" failed. No retries permitted until 2026-04-16 20:20:49.567074912 +0000 UTC m=+546.742144059 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls") pod "switch-graph-19633-6665f68b97-zqmlg" (UID: "0afef93a-f293-4d32-a792-6189db0c28dd") : secret "switch-graph-19633-serving-cert" not found Apr 16 20:20:49.067508 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.067492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0afef93a-f293-4d32-a792-6189db0c28dd-openshift-service-ca-bundle\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.571313 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.571276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.573460 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.573440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls\") pod \"switch-graph-19633-6665f68b97-zqmlg\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") " pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.806636 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.806588 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:49.923701 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:49.923676 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"] Apr 16 20:20:49.925502 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:20:49.925472 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0afef93a_f293_4d32_a792_6189db0c28dd.slice/crio-388d27dc467ec6ed4021a06b4ff8eaba702096c995c544b47d9d52ca711daea0 WatchSource:0}: Error finding container 388d27dc467ec6ed4021a06b4ff8eaba702096c995c544b47d9d52ca711daea0: Status 404 returned error can't find the container with id 388d27dc467ec6ed4021a06b4ff8eaba702096c995c544b47d9d52ca711daea0 Apr 16 20:20:50.168369 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:50.168289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" event={"ID":"0afef93a-f293-4d32-a792-6189db0c28dd","Type":"ContainerStarted","Data":"388d27dc467ec6ed4021a06b4ff8eaba702096c995c544b47d9d52ca711daea0"} Apr 16 20:20:53.178289 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:53.178259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" event={"ID":"0afef93a-f293-4d32-a792-6189db0c28dd","Type":"ContainerStarted","Data":"a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377"} Apr 16 20:20:53.178687 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:53.178366 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:20:53.198697 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:53.198651 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" 
podStartSLOduration=2.813847945 podStartE2EDuration="5.198637352s" podCreationTimestamp="2026-04-16 20:20:48 +0000 UTC" firstStartedPulling="2026-04-16 20:20:49.92724763 +0000 UTC m=+547.102316776" lastFinishedPulling="2026-04-16 20:20:52.312037026 +0000 UTC m=+549.487106183" observedRunningTime="2026-04-16 20:20:53.197042635 +0000 UTC m=+550.372111801" watchObservedRunningTime="2026-04-16 20:20:53.198637352 +0000 UTC m=+550.373706520" Apr 16 20:20:56.037744 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:56.037695 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 20:20:56.038155 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:56.037695 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 20:20:57.041584 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:57.041541 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 20:20:59.187727 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:20:59.187692 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" Apr 16 20:21:03.106606 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:03.106574 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"] Apr 16 
20:21:03.106997 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:03.106791 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" containerID="cri-o://a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377" gracePeriod=30
Apr 16 20:21:04.185923 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:04.185879 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:06.038332 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:06.038284 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 20:21:06.038791 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:06.038297 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 20:21:07.042296 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:07.042268 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"
Apr 16 20:21:08.393290 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:08.393249 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 20:21:09.186254 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:09.186216 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:14.186486 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:14.186445 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:14.186884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:14.186562 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"
Apr 16 20:21:16.039306 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:16.039275 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"
Apr 16 20:21:18.395067 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:18.395024 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"
Apr 16 20:21:19.186300 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:19.186263 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:24.186733 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:24.186692 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:29.186126 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:29.186069 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:33.271391 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.271362 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"
Apr 16 20:21:33.300649 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.300612 2574 generic.go:358] "Generic (PLEG): container finished" podID="0afef93a-f293-4d32-a792-6189db0c28dd" containerID="a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377" exitCode=137
Apr 16 20:21:33.300812 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.300679 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"
Apr 16 20:21:33.300812 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.300698 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" event={"ID":"0afef93a-f293-4d32-a792-6189db0c28dd","Type":"ContainerDied","Data":"a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377"}
Apr 16 20:21:33.300812 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.300741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg" event={"ID":"0afef93a-f293-4d32-a792-6189db0c28dd","Type":"ContainerDied","Data":"388d27dc467ec6ed4021a06b4ff8eaba702096c995c544b47d9d52ca711daea0"}
Apr 16 20:21:33.300812 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.300762 2574 scope.go:117] "RemoveContainer" containerID="a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377"
Apr 16 20:21:33.308610 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.308592 2574 scope.go:117] "RemoveContainer" containerID="a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377"
Apr 16 20:21:33.308862 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:33.308842 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377\": container with ID starting with a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377 not found: ID does not exist" containerID="a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377"
Apr 16 20:21:33.308914 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.308871 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377"} err="failed to get container status \"a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377\": rpc error: code = NotFound desc = could not find container \"a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377\": container with ID starting with a6aa2cb5eb22e0e7e7ced6b2fbcf3e1ce26685011069af774b3c287d59a58377 not found: ID does not exist"
Apr 16 20:21:33.339150 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.339092 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0afef93a-f293-4d32-a792-6189db0c28dd-openshift-service-ca-bundle\") pod \"0afef93a-f293-4d32-a792-6189db0c28dd\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") "
Apr 16 20:21:33.339331 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.339216 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls\") pod \"0afef93a-f293-4d32-a792-6189db0c28dd\" (UID: \"0afef93a-f293-4d32-a792-6189db0c28dd\") "
Apr 16 20:21:33.339514 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.339490 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afef93a-f293-4d32-a792-6189db0c28dd-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0afef93a-f293-4d32-a792-6189db0c28dd" (UID: "0afef93a-f293-4d32-a792-6189db0c28dd"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:21:33.341301 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.341277 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0afef93a-f293-4d32-a792-6189db0c28dd" (UID: "0afef93a-f293-4d32-a792-6189db0c28dd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:21:33.439781 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.439756 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0afef93a-f293-4d32-a792-6189db0c28dd-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:21:33.439781 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.439785 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0afef93a-f293-4d32-a792-6189db0c28dd-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:21:33.615246 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.615213 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"]
Apr 16 20:21:33.617984 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:33.617959 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-19633-6665f68b97-zqmlg"]
Apr 16 20:21:35.397284 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:35.397249 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" path="/var/lib/kubelet/pods/0afef93a-f293-4d32-a792-6189db0c28dd/volumes"
Apr 16 20:21:38.852886 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.852853 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"]
Apr 16 20:21:38.853251 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.853211 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633"
Apr 16 20:21:38.853251 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.853225 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633"
Apr 16 20:21:38.853364 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.853278 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0afef93a-f293-4d32-a792-6189db0c28dd" containerName="switch-graph-19633"
Apr 16 20:21:38.856213 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.856196 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:38.858357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.858337 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:21:38.858482 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.858365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 16 20:21:38.858482 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.858466 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 16 20:21:38.864613 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.864593 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"]
Apr 16 20:21:38.989993 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.989953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865e0095-0f83-4e87-b671-e40cf20e1399-proxy-tls\") pod \"model-chainer-5ddf86697d-c728g\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:38.990178 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:38.990000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865e0095-0f83-4e87-b671-e40cf20e1399-openshift-service-ca-bundle\") pod \"model-chainer-5ddf86697d-c728g\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:39.090899 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.090861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865e0095-0f83-4e87-b671-e40cf20e1399-proxy-tls\") pod \"model-chainer-5ddf86697d-c728g\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:39.090899 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.090898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865e0095-0f83-4e87-b671-e40cf20e1399-openshift-service-ca-bundle\") pod \"model-chainer-5ddf86697d-c728g\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:39.091585 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.091562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865e0095-0f83-4e87-b671-e40cf20e1399-openshift-service-ca-bundle\") pod \"model-chainer-5ddf86697d-c728g\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:39.093258 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.093238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865e0095-0f83-4e87-b671-e40cf20e1399-proxy-tls\") pod \"model-chainer-5ddf86697d-c728g\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:39.166496 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.166416 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:39.285345 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.285302 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"]
Apr 16 20:21:39.288203 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:21:39.288166 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865e0095_0f83_4e87_b671_e40cf20e1399.slice/crio-8a425eae2ffee1d974e81856e3fbe00a3e4851029b40cfe2a008cd00ee1fbdc2 WatchSource:0}: Error finding container 8a425eae2ffee1d974e81856e3fbe00a3e4851029b40cfe2a008cd00ee1fbdc2: Status 404 returned error can't find the container with id 8a425eae2ffee1d974e81856e3fbe00a3e4851029b40cfe2a008cd00ee1fbdc2
Apr 16 20:21:39.320527 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:39.320500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" event={"ID":"865e0095-0f83-4e87-b671-e40cf20e1399","Type":"ContainerStarted","Data":"8a425eae2ffee1d974e81856e3fbe00a3e4851029b40cfe2a008cd00ee1fbdc2"}
Apr 16 20:21:40.325207 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:40.325169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" event={"ID":"865e0095-0f83-4e87-b671-e40cf20e1399","Type":"ContainerStarted","Data":"0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1"}
Apr 16 20:21:40.325582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:40.325309 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:40.341862 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:40.341813 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podStartSLOduration=2.341799395 podStartE2EDuration="2.341799395s" podCreationTimestamp="2026-04-16 20:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:21:40.340273616 +0000 UTC m=+597.515342783" watchObservedRunningTime="2026-04-16 20:21:40.341799395 +0000 UTC m=+597.516868563"
Apr 16 20:21:46.333840 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:46.333807 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"
Apr 16 20:21:49.079803 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.079767 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"]
Apr 16 20:21:49.080372 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.079968 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" containerID="cri-o://0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1" gracePeriod=30
Apr 16 20:21:49.136125 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.136071 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"]
Apr 16 20:21:49.136413 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.136363 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" containerID="cri-o://2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4" gracePeriod=30
Apr 16 20:21:49.191053 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.191013 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"]
Apr 16 20:21:49.191340 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.191319 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" containerID="cri-o://ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb" gracePeriod=30
Apr 16 20:21:49.276283 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.276242 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"]
Apr 16 20:21:49.276527 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:49.276504 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" containerID="cri-o://51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6" gracePeriod=30
Apr 16 20:21:51.332927 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:51.332875 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:21:53.071334 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.071010 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"
Apr 16 20:21:53.203340 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.203240 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1c4a0e5-d213-4def-9f4e-188d05a7e679-kserve-provision-location\") pod \"a1c4a0e5-d213-4def-9f4e-188d05a7e679\" (UID: \"a1c4a0e5-d213-4def-9f4e-188d05a7e679\") "
Apr 16 20:21:53.203572 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.203547 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c4a0e5-d213-4def-9f4e-188d05a7e679-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a1c4a0e5-d213-4def-9f4e-188d05a7e679" (UID: "a1c4a0e5-d213-4def-9f4e-188d05a7e679"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:21:53.304348 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.304309 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1c4a0e5-d213-4def-9f4e-188d05a7e679-kserve-provision-location\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:21:53.364409 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.364376 2574 generic.go:358] "Generic (PLEG): container finished" podID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerID="2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4" exitCode=0
Apr 16 20:21:53.364579 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.364430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" event={"ID":"a1c4a0e5-d213-4def-9f4e-188d05a7e679","Type":"ContainerDied","Data":"2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4"}
Apr 16 20:21:53.364579 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.364456 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q" event={"ID":"a1c4a0e5-d213-4def-9f4e-188d05a7e679","Type":"ContainerDied","Data":"5be9a5eeb356cfc54021069a609082a677f6a3e750f2a1d8b687a5ae714a58ac"}
Apr 16 20:21:53.364579 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.364461 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"
Apr 16 20:21:53.364579 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.364471 2574 scope.go:117] "RemoveContainer" containerID="2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4"
Apr 16 20:21:53.372906 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.372876 2574 scope.go:117] "RemoveContainer" containerID="26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83"
Apr 16 20:21:53.383960 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.383939 2574 scope.go:117] "RemoveContainer" containerID="2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4"
Apr 16 20:21:53.384257 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:53.384232 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4\": container with ID starting with 2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4 not found: ID does not exist" containerID="2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4"
Apr 16 20:21:53.384344 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.384271 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4"} err="failed to get container status \"2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4\": rpc error: code = NotFound desc = could not find container \"2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4\": container with ID starting with 2df1e0e78f37fe4b12f15aecc67ef0a6b07dd74cdeebc1bbbfe5035333b3f3d4 not found: ID does not exist"
Apr 16 20:21:53.384344 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.384296 2574 scope.go:117] "RemoveContainer" containerID="26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83"
Apr 16 20:21:53.384526 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:53.384510 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83\": container with ID starting with 26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83 not found: ID does not exist" containerID="26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83"
Apr 16 20:21:53.384578 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.384534 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83"} err="failed to get container status \"26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83\": rpc error: code = NotFound desc = could not find container \"26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83\": container with ID starting with 26034ae7c0c2033d492d246d08b28c633ba9e407b51e0294253b04fbbd40ac83 not found: ID does not exist"
Apr 16 20:21:53.386918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.386898 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"]
Apr 16 20:21:53.390527 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.390497 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-qqb9q"]
Apr 16 20:21:53.397006 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.396984 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" path="/var/lib/kubelet/pods/a1c4a0e5-d213-4def-9f4e-188d05a7e679/volumes"
Apr 16 20:21:53.825571 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.825546 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"
Apr 16 20:21:53.909063 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.909037 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5395df40-d086-40a3-8f00-190870297e9f-kserve-provision-location\") pod \"5395df40-d086-40a3-8f00-190870297e9f\" (UID: \"5395df40-d086-40a3-8f00-190870297e9f\") "
Apr 16 20:21:53.909363 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.909341 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5395df40-d086-40a3-8f00-190870297e9f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5395df40-d086-40a3-8f00-190870297e9f" (UID: "5395df40-d086-40a3-8f00-190870297e9f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:21:53.922829 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:53.922757 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"
Apr 16 20:21:54.009860 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.009820 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/432f4206-43c5-40ef-abd2-fd526131a888-kserve-provision-location\") pod \"432f4206-43c5-40ef-abd2-fd526131a888\" (UID: \"432f4206-43c5-40ef-abd2-fd526131a888\") "
Apr 16 20:21:54.010024 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.010015 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5395df40-d086-40a3-8f00-190870297e9f-kserve-provision-location\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:21:54.010187 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.010164 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432f4206-43c5-40ef-abd2-fd526131a888-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "432f4206-43c5-40ef-abd2-fd526131a888" (UID: "432f4206-43c5-40ef-abd2-fd526131a888"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:21:54.110758 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.110717 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/432f4206-43c5-40ef-abd2-fd526131a888-kserve-provision-location\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:21:54.370250 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.370217 2574 generic.go:358] "Generic (PLEG): container finished" podID="5395df40-d086-40a3-8f00-190870297e9f" containerID="ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb" exitCode=0
Apr 16 20:21:54.370436 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.370290 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"
Apr 16 20:21:54.370436 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.370296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" event={"ID":"5395df40-d086-40a3-8f00-190870297e9f","Type":"ContainerDied","Data":"ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb"}
Apr 16 20:21:54.370436 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.370328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6" event={"ID":"5395df40-d086-40a3-8f00-190870297e9f","Type":"ContainerDied","Data":"01145f4f6d9befb9ad3952a7ccdfb148a15a974bdc3c623bb775641e6e648146"}
Apr 16 20:21:54.370436 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.370343 2574 scope.go:117] "RemoveContainer" containerID="ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb"
Apr 16 20:21:54.371678 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.371653 2574 generic.go:358] "Generic (PLEG): container finished" podID="432f4206-43c5-40ef-abd2-fd526131a888" containerID="51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6" exitCode=0
Apr 16 20:21:54.371774 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.371725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" event={"ID":"432f4206-43c5-40ef-abd2-fd526131a888","Type":"ContainerDied","Data":"51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6"}
Apr 16 20:21:54.371774 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.371731 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"
Apr 16 20:21:54.371774 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.371748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn" event={"ID":"432f4206-43c5-40ef-abd2-fd526131a888","Type":"ContainerDied","Data":"304c93107c0a92dced8e6da8b150ba9cbdd1ba1f7f64bd2dcd6f63853da76992"}
Apr 16 20:21:54.378823 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.378802 2574 scope.go:117] "RemoveContainer" containerID="1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1"
Apr 16 20:21:54.386366 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.386348 2574 scope.go:117] "RemoveContainer" containerID="ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb"
Apr 16 20:21:54.386642 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:54.386623 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb\": container with ID starting with ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb not found: ID does not exist" containerID="ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb"
Apr 16 20:21:54.386702 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.386650 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb"} err="failed to get container status \"ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb\": rpc error: code = NotFound desc = could not find container \"ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb\": container with ID starting with ddff52a031770114e37dca841bbe6c7032c5e5b5a4365ea0001d204096c4d3cb not found: ID does not exist"
Apr 16 20:21:54.386702 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.386667 2574 scope.go:117] "RemoveContainer" containerID="1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1"
Apr 16 20:21:54.386925 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:54.386901 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1\": container with ID starting with 1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1 not found: ID does not exist" containerID="1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1"
Apr 16 20:21:54.386970 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.386929 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1"} err="failed to get container status \"1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1\": rpc error: code = NotFound desc = could not find container \"1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1\": container with ID starting with 1d0dc7207f7653c7b37bb79b9190b3a72272f6ea4472b3c5687e553f59bf97b1 not found: ID does not exist"
Apr 16 20:21:54.386970 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.386942 2574 scope.go:117] "RemoveContainer" containerID="51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6"
Apr 16 20:21:54.393068 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.393043 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"]
Apr 16 20:21:54.394193 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.394174 2574 scope.go:117] "RemoveContainer" containerID="9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d"
Apr 16 20:21:54.398756 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.398730 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-lxkhn"]
Apr 16 20:21:54.401777 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.401760 2574 scope.go:117] "RemoveContainer" containerID="51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6"
Apr 16 20:21:54.402038 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:54.402020 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6\": container with ID starting with 51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6 not found: ID does not exist" containerID="51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6"
Apr 16 20:21:54.402088 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.402047 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6"} err="failed to get container status \"51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6\": rpc error: code = NotFound desc = could not find container \"51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6\": container with ID starting with 51f415d4a62088fb603b5591972c9404ad620c1170dcd70ab6057646f365cbd6 not found: ID does not exist"
Apr 16 20:21:54.402088 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.402064 2574 scope.go:117] "RemoveContainer" containerID="9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d"
Apr 16 20:21:54.402304 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:21:54.402290 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d\": container with ID starting with 9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d not found: ID does not exist" containerID="9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d"
Apr 16 20:21:54.402346 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.402307 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d"} err="failed to get container status \"9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d\": rpc error: code = NotFound desc = could not find container \"9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d\": container with ID starting with 9ecbf5c7cd25e26accf5bd3b72944e37458061f8e8e20f88108b53fc0e6e205d not found: ID does not exist"
Apr 16 20:21:54.407841 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.407821 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"]
Apr 16 20:21:54.413615 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:54.413593 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-v4xj6"]
Apr 16 20:21:55.397411 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:55.397371 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432f4206-43c5-40ef-abd2-fd526131a888" path="/var/lib/kubelet/pods/432f4206-43c5-40ef-abd2-fd526131a888/volumes" Apr
16 20:21:55.397940 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:55.397919 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5395df40-d086-40a3-8f00-190870297e9f" path="/var/lib/kubelet/pods/5395df40-d086-40a3-8f00-190870297e9f/volumes" Apr 16 20:21:56.332361 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:21:56.332322 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:01.332684 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:01.332649 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:01.333071 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:01.332756 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" Apr 16 20:22:06.332833 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:06.332748 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:11.332741 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:11.332705 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:13.370025 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.369989 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"] Apr 16 20:22:13.370491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370439 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" Apr 16 20:22:13.370491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370457 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" Apr 16 20:22:13.370491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370472 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="storage-initializer" Apr 16 20:22:13.370491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370480 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="storage-initializer" Apr 16 20:22:13.370491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370490 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="storage-initializer" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370500 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="storage-initializer" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370509 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370527 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370537 2574 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370545 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370565 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="storage-initializer" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370573 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="storage-initializer" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370649 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1c4a0e5-d213-4def-9f4e-188d05a7e679" containerName="kserve-container" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370662 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5395df40-d086-40a3-8f00-190870297e9f" containerName="kserve-container" Apr 16 20:22:13.370749 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.370674 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="432f4206-43c5-40ef-abd2-fd526131a888" containerName="kserve-container" Apr 16 20:22:13.373750 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.373730 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:13.375765 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.375740 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-15788-serving-cert\"" Apr 16 20:22:13.375889 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.375753 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-15788-kube-rbac-proxy-sar-config\"" Apr 16 20:22:13.383931 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.383152 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"] Apr 16 20:22:13.471355 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.471318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34d9759-f7dc-43db-af7c-9b9c133cd98b-openshift-service-ca-bundle\") pod \"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:13.471547 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.471442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls\") pod \"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:13.572118 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.572072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34d9759-f7dc-43db-af7c-9b9c133cd98b-openshift-service-ca-bundle\") pod 
\"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:13.572266 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.572175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls\") pod \"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:13.572305 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:22:13.572288 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-15788-serving-cert: secret "switch-graph-15788-serving-cert" not found Apr 16 20:22:13.572372 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:22:13.572361 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls podName:b34d9759-f7dc-43db-af7c-9b9c133cd98b nodeName:}" failed. No retries permitted until 2026-04-16 20:22:14.072336848 +0000 UTC m=+631.247405997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls") pod "switch-graph-15788-6fd7bd5656-twm8c" (UID: "b34d9759-f7dc-43db-af7c-9b9c133cd98b") : secret "switch-graph-15788-serving-cert" not found Apr 16 20:22:13.572837 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:13.572818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34d9759-f7dc-43db-af7c-9b9c133cd98b-openshift-service-ca-bundle\") pod \"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:14.075448 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:14.075409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls\") pod \"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:14.077970 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:14.077940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls\") pod \"switch-graph-15788-6fd7bd5656-twm8c\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") " pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:14.289387 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:14.289349 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:14.409806 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:14.409780 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"] Apr 16 20:22:14.412430 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:22:14.412405 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb34d9759_f7dc_43db_af7c_9b9c133cd98b.slice/crio-882714c118978200ec29e4b1c93f2f57e405f7c8275fe23d9dbe5c8c37357e10 WatchSource:0}: Error finding container 882714c118978200ec29e4b1c93f2f57e405f7c8275fe23d9dbe5c8c37357e10: Status 404 returned error can't find the container with id 882714c118978200ec29e4b1c93f2f57e405f7c8275fe23d9dbe5c8c37357e10 Apr 16 20:22:14.427333 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:14.427296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" event={"ID":"b34d9759-f7dc-43db-af7c-9b9c133cd98b","Type":"ContainerStarted","Data":"882714c118978200ec29e4b1c93f2f57e405f7c8275fe23d9dbe5c8c37357e10"} Apr 16 20:22:15.431344 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:15.431305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" event={"ID":"b34d9759-f7dc-43db-af7c-9b9c133cd98b","Type":"ContainerStarted","Data":"a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31"} Apr 16 20:22:15.431834 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:15.431401 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:15.446968 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:15.446918 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" 
podStartSLOduration=2.446902956 podStartE2EDuration="2.446902956s" podCreationTimestamp="2026-04-16 20:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:22:15.445625001 +0000 UTC m=+632.620694169" watchObservedRunningTime="2026-04-16 20:22:15.446902956 +0000 UTC m=+632.621972124" Apr 16 20:22:16.331981 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:16.331942 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:19.229122 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.229089 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" Apr 16 20:22:19.316288 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.316256 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865e0095-0f83-4e87-b671-e40cf20e1399-proxy-tls\") pod \"865e0095-0f83-4e87-b671-e40cf20e1399\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " Apr 16 20:22:19.316485 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.316304 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865e0095-0f83-4e87-b671-e40cf20e1399-openshift-service-ca-bundle\") pod \"865e0095-0f83-4e87-b671-e40cf20e1399\" (UID: \"865e0095-0f83-4e87-b671-e40cf20e1399\") " Apr 16 20:22:19.316684 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.316656 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/865e0095-0f83-4e87-b671-e40cf20e1399-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "865e0095-0f83-4e87-b671-e40cf20e1399" (UID: "865e0095-0f83-4e87-b671-e40cf20e1399"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:22:19.318380 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.318355 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865e0095-0f83-4e87-b671-e40cf20e1399-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "865e0095-0f83-4e87-b671-e40cf20e1399" (UID: "865e0095-0f83-4e87-b671-e40cf20e1399"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:22:19.417075 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.417037 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865e0095-0f83-4e87-b671-e40cf20e1399-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:22:19.417075 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.417067 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865e0095-0f83-4e87-b671-e40cf20e1399-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:22:19.442722 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.442693 2574 generic.go:358] "Generic (PLEG): container finished" podID="865e0095-0f83-4e87-b671-e40cf20e1399" containerID="0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1" exitCode=0 Apr 16 20:22:19.442884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.442755 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" Apr 16 20:22:19.442884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.442758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" event={"ID":"865e0095-0f83-4e87-b671-e40cf20e1399","Type":"ContainerDied","Data":"0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1"} Apr 16 20:22:19.442884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.442786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g" event={"ID":"865e0095-0f83-4e87-b671-e40cf20e1399","Type":"ContainerDied","Data":"8a425eae2ffee1d974e81856e3fbe00a3e4851029b40cfe2a008cd00ee1fbdc2"} Apr 16 20:22:19.442884 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.442801 2574 scope.go:117] "RemoveContainer" containerID="0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1" Apr 16 20:22:19.454600 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.454581 2574 scope.go:117] "RemoveContainer" containerID="0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1" Apr 16 20:22:19.454888 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:22:19.454868 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1\": container with ID starting with 0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1 not found: ID does not exist" containerID="0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1" Apr 16 20:22:19.454936 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.454897 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1"} err="failed to get container status 
\"0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1\": rpc error: code = NotFound desc = could not find container \"0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1\": container with ID starting with 0b59081868e9bc58f44105356e06d123dbd325eb91c1b5a2a105846f134083f1 not found: ID does not exist" Apr 16 20:22:19.463912 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.463885 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"] Apr 16 20:22:19.466979 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:19.466955 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5ddf86697d-c728g"] Apr 16 20:22:21.397278 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:21.397244 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" path="/var/lib/kubelet/pods/865e0095-0f83-4e87-b671-e40cf20e1399/volumes" Apr 16 20:22:21.440300 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:21.440274 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" Apr 16 20:22:59.226215 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.226180 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"] Apr 16 20:22:59.226680 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.226523 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" Apr 16 20:22:59.226680 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.226535 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" Apr 16 20:22:59.226680 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.226585 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="865e0095-0f83-4e87-b671-e40cf20e1399" containerName="model-chainer" Apr 16 20:22:59.230723 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.230705 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.232959 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.232930 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f682f-serving-cert\"" Apr 16 20:22:59.232959 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.232931 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f682f-kube-rbac-proxy-sar-config\"" Apr 16 20:22:59.236687 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.236661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"] Apr 16 20:22:59.333628 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.333590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ac9c2-b766-4ecd-a08b-8a8972c30434-openshift-service-ca-bundle\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.333628 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.333631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.434486 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.434441 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ac9c2-b766-4ecd-a08b-8a8972c30434-openshift-service-ca-bundle\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.434670 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.434496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.434670 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:22:59.434622 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-f682f-serving-cert: secret "sequence-graph-f682f-serving-cert" not found Apr 16 20:22:59.434759 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:22:59.434677 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls podName:969ac9c2-b766-4ecd-a08b-8a8972c30434 nodeName:}" failed. No retries permitted until 2026-04-16 20:22:59.934660581 +0000 UTC m=+677.109729728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls") pod "sequence-graph-f682f-6fb54f4984-mgr7g" (UID: "969ac9c2-b766-4ecd-a08b-8a8972c30434") : secret "sequence-graph-f682f-serving-cert" not found Apr 16 20:22:59.435172 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.435153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ac9c2-b766-4ecd-a08b-8a8972c30434-openshift-service-ca-bundle\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.939297 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.939259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:22:59.941681 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:22:59.941652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls\") pod \"sequence-graph-f682f-6fb54f4984-mgr7g\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") " pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:23:00.141754 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:00.141718 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:23:00.267739 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:00.267711 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"] Apr 16 20:23:00.270176 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:23:00.270144 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969ac9c2_b766_4ecd_a08b_8a8972c30434.slice/crio-a0a95d204cdb6c7259e350b0627403738e13e44b5c7220d091df7bbf12f2e9b0 WatchSource:0}: Error finding container a0a95d204cdb6c7259e350b0627403738e13e44b5c7220d091df7bbf12f2e9b0: Status 404 returned error can't find the container with id a0a95d204cdb6c7259e350b0627403738e13e44b5c7220d091df7bbf12f2e9b0 Apr 16 20:23:00.272159 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:00.272141 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:23:00.562174 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:00.562131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" event={"ID":"969ac9c2-b766-4ecd-a08b-8a8972c30434","Type":"ContainerStarted","Data":"6e2770cf631ca88239ed43e86747fc8eb8f31d5449b0d338552f74f6ce654d58"} Apr 16 20:23:00.562174 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:00.562178 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" event={"ID":"969ac9c2-b766-4ecd-a08b-8a8972c30434","Type":"ContainerStarted","Data":"a0a95d204cdb6c7259e350b0627403738e13e44b5c7220d091df7bbf12f2e9b0"} Apr 16 20:23:00.562402 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:00.562208 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" Apr 16 20:23:00.582945 ip-10-0-134-158 
kubenswrapper[2574]: I0416 20:23:00.582889 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podStartSLOduration=1.582873338 podStartE2EDuration="1.582873338s" podCreationTimestamp="2026-04-16 20:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:00.581677956 +0000 UTC m=+677.756747125" watchObservedRunningTime="2026-04-16 20:23:00.582873338 +0000 UTC m=+677.757942553"
Apr 16 20:23:06.571179 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:23:06.571143 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"
Apr 16 20:30:28.130352 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:28.130321 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"]
Apr 16 20:30:28.133320 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:28.130625 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" containerID="cri-o://a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31" gracePeriod=30
Apr 16 20:30:31.438493 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:31.438455 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:36.437823 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:36.437787 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:41.437987 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:41.437946 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:41.438458 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:41.438040 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"
Apr 16 20:30:46.437866 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:46.437832 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:51.438049 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:51.438012 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:56.438643 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:56.438608 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:30:58.266850 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.266825 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"
Apr 16 20:30:58.299036 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.299005 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls\") pod \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") "
Apr 16 20:30:58.299212 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.299090 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34d9759-f7dc-43db-af7c-9b9c133cd98b-openshift-service-ca-bundle\") pod \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\" (UID: \"b34d9759-f7dc-43db-af7c-9b9c133cd98b\") "
Apr 16 20:30:58.299400 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.299380 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34d9759-f7dc-43db-af7c-9b9c133cd98b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b34d9759-f7dc-43db-af7c-9b9c133cd98b" (UID: "b34d9759-f7dc-43db-af7c-9b9c133cd98b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:30:58.300986 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.300958 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b34d9759-f7dc-43db-af7c-9b9c133cd98b" (UID: "b34d9759-f7dc-43db-af7c-9b9c133cd98b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:30:58.400485 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.400405 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b34d9759-f7dc-43db-af7c-9b9c133cd98b-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:30:58.400485 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.400432 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34d9759-f7dc-43db-af7c-9b9c133cd98b-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:30:58.901410 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.901375 2574 generic.go:358] "Generic (PLEG): container finished" podID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerID="a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31" exitCode=0
Apr 16 20:30:58.901593 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.901448 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"
Apr 16 20:30:58.901593 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.901455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" event={"ID":"b34d9759-f7dc-43db-af7c-9b9c133cd98b","Type":"ContainerDied","Data":"a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31"}
Apr 16 20:30:58.901593 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.901493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c" event={"ID":"b34d9759-f7dc-43db-af7c-9b9c133cd98b","Type":"ContainerDied","Data":"882714c118978200ec29e4b1c93f2f57e405f7c8275fe23d9dbe5c8c37357e10"}
Apr 16 20:30:58.901593 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.901508 2574 scope.go:117] "RemoveContainer" containerID="a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31"
Apr 16 20:30:58.909228 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.909205 2574 scope.go:117] "RemoveContainer" containerID="a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31"
Apr 16 20:30:58.909494 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:30:58.909473 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31\": container with ID starting with a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31 not found: ID does not exist" containerID="a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31"
Apr 16 20:30:58.909562 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.909501 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31"} err="failed to get container status \"a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31\": rpc error: code = NotFound desc = could not find container \"a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31\": container with ID starting with a444435182c596c7b1419bcc55274087557e3777d3c08ee84c432d116d7d3d31 not found: ID does not exist"
Apr 16 20:30:58.921264 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.921240 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"]
Apr 16 20:30:58.924189 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:58.924168 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-15788-6fd7bd5656-twm8c"]
Apr 16 20:30:59.397197 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:30:59.397162 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" path="/var/lib/kubelet/pods/b34d9759-f7dc-43db-af7c-9b9c133cd98b/volumes"
Apr 16 20:31:13.990589 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:13.990514 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"]
Apr 16 20:31:13.990942 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:13.990775 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" containerID="cri-o://6e2770cf631ca88239ed43e86747fc8eb8f31d5449b0d338552f74f6ce654d58" gracePeriod=30
Apr 16 20:31:16.569674 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:16.569637 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:21.569517 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:21.569473 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:26.569064 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:26.569024 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:26.569492 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:26.569166 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"
Apr 16 20:31:31.568759 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:31.568721 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:36.569090 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:36.569054 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:38.365405 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.365326 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"]
Apr 16 20:31:38.365735 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.365642 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788"
Apr 16 20:31:38.365735 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.365655 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788"
Apr 16 20:31:38.365735 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.365707 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b34d9759-f7dc-43db-af7c-9b9c133cd98b" containerName="switch-graph-15788"
Apr 16 20:31:38.368524 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.368508 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.370937 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.370913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-8eac5-serving-cert\""
Apr 16 20:31:38.371040 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.370965 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-8eac5-kube-rbac-proxy-sar-config\""
Apr 16 20:31:38.377596 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.377570 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"]
Apr 16 20:31:38.526200 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.526160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd521b28-28bc-432c-b6d6-de080097f1f9-openshift-service-ca-bundle\") pod \"ensemble-graph-8eac5-6b5d7df8f4-l6vjl\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") " pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.526200 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.526206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd521b28-28bc-432c-b6d6-de080097f1f9-proxy-tls\") pod \"ensemble-graph-8eac5-6b5d7df8f4-l6vjl\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") " pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.627271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.627189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd521b28-28bc-432c-b6d6-de080097f1f9-openshift-service-ca-bundle\") pod \"ensemble-graph-8eac5-6b5d7df8f4-l6vjl\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") " pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.627271 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.627228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd521b28-28bc-432c-b6d6-de080097f1f9-proxy-tls\") pod \"ensemble-graph-8eac5-6b5d7df8f4-l6vjl\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") " pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.627809 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.627772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd521b28-28bc-432c-b6d6-de080097f1f9-openshift-service-ca-bundle\") pod \"ensemble-graph-8eac5-6b5d7df8f4-l6vjl\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") " pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.629633 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.629607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd521b28-28bc-432c-b6d6-de080097f1f9-proxy-tls\") pod \"ensemble-graph-8eac5-6b5d7df8f4-l6vjl\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") " pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.679127 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.679067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:38.795233 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.795209 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"]
Apr 16 20:31:38.797097 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:31:38.797070 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd521b28_28bc_432c_b6d6_de080097f1f9.slice/crio-15ed1b5e810185fd0aec720da6dac2af0cee64fe90ff2704e1f2249e679f77a1 WatchSource:0}: Error finding container 15ed1b5e810185fd0aec720da6dac2af0cee64fe90ff2704e1f2249e679f77a1: Status 404 returned error can't find the container with id 15ed1b5e810185fd0aec720da6dac2af0cee64fe90ff2704e1f2249e679f77a1
Apr 16 20:31:38.798849 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:38.798832 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:31:39.011324 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:39.011282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" event={"ID":"fd521b28-28bc-432c-b6d6-de080097f1f9","Type":"ContainerStarted","Data":"44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e"}
Apr 16 20:31:39.011324 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:39.011319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" event={"ID":"fd521b28-28bc-432c-b6d6-de080097f1f9","Type":"ContainerStarted","Data":"15ed1b5e810185fd0aec720da6dac2af0cee64fe90ff2704e1f2249e679f77a1"}
Apr 16 20:31:39.011627 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:39.011406 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:39.027434 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:39.027389 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podStartSLOduration=1.02735074 podStartE2EDuration="1.02735074s" podCreationTimestamp="2026-04-16 20:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:39.025988343 +0000 UTC m=+1196.201057510" watchObservedRunningTime="2026-04-16 20:31:39.02735074 +0000 UTC m=+1196.202419907"
Apr 16 20:31:41.568836 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:41.568802 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:44.027044 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.027011 2574 generic.go:358] "Generic (PLEG): container finished" podID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerID="6e2770cf631ca88239ed43e86747fc8eb8f31d5449b0d338552f74f6ce654d58" exitCode=0
Apr 16 20:31:44.027485 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.027059 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" event={"ID":"969ac9c2-b766-4ecd-a08b-8a8972c30434","Type":"ContainerDied","Data":"6e2770cf631ca88239ed43e86747fc8eb8f31d5449b0d338552f74f6ce654d58"}
Apr 16 20:31:44.128290 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.128267 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"
Apr 16 20:31:44.175506 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.175474 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls\") pod \"969ac9c2-b766-4ecd-a08b-8a8972c30434\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") "
Apr 16 20:31:44.175678 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.175535 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ac9c2-b766-4ecd-a08b-8a8972c30434-openshift-service-ca-bundle\") pod \"969ac9c2-b766-4ecd-a08b-8a8972c30434\" (UID: \"969ac9c2-b766-4ecd-a08b-8a8972c30434\") "
Apr 16 20:31:44.175897 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.175866 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969ac9c2-b766-4ecd-a08b-8a8972c30434-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "969ac9c2-b766-4ecd-a08b-8a8972c30434" (UID: "969ac9c2-b766-4ecd-a08b-8a8972c30434"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:31:44.177613 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.177591 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "969ac9c2-b766-4ecd-a08b-8a8972c30434" (UID: "969ac9c2-b766-4ecd-a08b-8a8972c30434"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:31:44.276435 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.276332 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ac9c2-b766-4ecd-a08b-8a8972c30434-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:31:44.276435 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:44.276379 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/969ac9c2-b766-4ecd-a08b-8a8972c30434-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:31:45.018969 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.018938 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:31:45.030899 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.030868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g" event={"ID":"969ac9c2-b766-4ecd-a08b-8a8972c30434","Type":"ContainerDied","Data":"a0a95d204cdb6c7259e350b0627403738e13e44b5c7220d091df7bbf12f2e9b0"}
Apr 16 20:31:45.031269 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.030906 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"
Apr 16 20:31:45.031269 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.030913 2574 scope.go:117] "RemoveContainer" containerID="6e2770cf631ca88239ed43e86747fc8eb8f31d5449b0d338552f74f6ce654d58"
Apr 16 20:31:45.057572 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.057541 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"]
Apr 16 20:31:45.063231 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.063207 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f682f-6fb54f4984-mgr7g"]
Apr 16 20:31:45.396956 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:45.396879 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" path="/var/lib/kubelet/pods/969ac9c2-b766-4ecd-a08b-8a8972c30434/volumes"
Apr 16 20:31:48.435504 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:48.435473 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"]
Apr 16 20:31:48.435898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:48.435673 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" containerID="cri-o://44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e" gracePeriod=30
Apr 16 20:31:50.018224 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:50.018182 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:31:55.017764 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:31:55.017725 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:32:00.017814 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:00.017770 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:32:00.018248 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:00.017886 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:32:05.017582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:05.017545 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:32:10.018054 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:10.018012 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:32:15.017614 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:15.017560 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:32:18.574874 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.574850 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:32:18.653155 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.653103 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd521b28-28bc-432c-b6d6-de080097f1f9-proxy-tls\") pod \"fd521b28-28bc-432c-b6d6-de080097f1f9\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") "
Apr 16 20:32:18.653338 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.653214 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd521b28-28bc-432c-b6d6-de080097f1f9-openshift-service-ca-bundle\") pod \"fd521b28-28bc-432c-b6d6-de080097f1f9\" (UID: \"fd521b28-28bc-432c-b6d6-de080097f1f9\") "
Apr 16 20:32:18.653529 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.653506 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd521b28-28bc-432c-b6d6-de080097f1f9-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fd521b28-28bc-432c-b6d6-de080097f1f9" (UID: "fd521b28-28bc-432c-b6d6-de080097f1f9"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:32:18.655097 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.655078 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd521b28-28bc-432c-b6d6-de080097f1f9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fd521b28-28bc-432c-b6d6-de080097f1f9" (UID: "fd521b28-28bc-432c-b6d6-de080097f1f9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:32:18.754690 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.754653 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd521b28-28bc-432c-b6d6-de080097f1f9-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:32:18.754690 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:18.754684 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd521b28-28bc-432c-b6d6-de080097f1f9-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:32:19.124445 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.124359 2574 generic.go:358] "Generic (PLEG): container finished" podID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerID="44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e" exitCode=0
Apr 16 20:32:19.124445 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.124404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" event={"ID":"fd521b28-28bc-432c-b6d6-de080097f1f9","Type":"ContainerDied","Data":"44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e"}
Apr 16 20:32:19.124445 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.124431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl" event={"ID":"fd521b28-28bc-432c-b6d6-de080097f1f9","Type":"ContainerDied","Data":"15ed1b5e810185fd0aec720da6dac2af0cee64fe90ff2704e1f2249e679f77a1"}
Apr 16 20:32:19.124445 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.124442 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"
Apr 16 20:32:19.124726 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.124452 2574 scope.go:117] "RemoveContainer" containerID="44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e"
Apr 16 20:32:19.132517 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.132496 2574 scope.go:117] "RemoveContainer" containerID="44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e"
Apr 16 20:32:19.132754 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:32:19.132730 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e\": container with ID starting with 44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e not found: ID does not exist" containerID="44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e"
Apr 16 20:32:19.132820 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.132761 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e"} err="failed to get container status \"44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e\": rpc error: code = NotFound desc = could not find container \"44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e\": container with ID starting with 44bba595eda4e79f0279bad4fc04b65a3c6ef46b41182e137d95accc6705e03e not found: ID does not exist"
Apr 16 20:32:19.144505 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.144479 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"]
Apr 16 20:32:19.146561 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.146541 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-8eac5-6b5d7df8f4-l6vjl"]
Apr 16 20:32:19.396479 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:19.396401 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" path="/var/lib/kubelet/pods/fd521b28-28bc-432c-b6d6-de080097f1f9/volumes"
Apr 16 20:32:24.162729 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.162693 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"]
Apr 16 20:32:24.163144 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.162985 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f"
Apr 16 20:32:24.163144 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.162997 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f"
Apr 16 20:32:24.163144 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.163021 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5"
Apr 16 20:32:24.163144 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.163027 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5"
Apr 16 20:32:24.163144 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.163071 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="969ac9c2-b766-4ecd-a08b-8a8972c30434" containerName="sequence-graph-f682f"
Apr 16 20:32:24.163144 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.163081 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd521b28-28bc-432c-b6d6-de080097f1f9" containerName="ensemble-graph-8eac5"
Apr 16 20:32:24.167257 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.167239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.169744 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.169723 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-01ef2-kube-rbac-proxy-sar-config\""
Apr 16 20:32:24.169744 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.169742 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-01ef2-serving-cert\""
Apr 16 20:32:24.170678 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.170654 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5hj24\""
Apr 16 20:32:24.170820 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.170654 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:32:24.173017 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.172998 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"]
Apr 16 20:32:24.301408 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.301378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-proxy-tls\") pod \"sequence-graph-01ef2-997d6cfb-pfzw8\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.301566 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.301433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-openshift-service-ca-bundle\") pod \"sequence-graph-01ef2-997d6cfb-pfzw8\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.402501 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.402459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-proxy-tls\") pod \"sequence-graph-01ef2-997d6cfb-pfzw8\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.402677 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.402523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-openshift-service-ca-bundle\") pod \"sequence-graph-01ef2-997d6cfb-pfzw8\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.403188 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.403164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-openshift-service-ca-bundle\") pod \"sequence-graph-01ef2-997d6cfb-pfzw8\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.404803 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.404782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-proxy-tls\") pod \"sequence-graph-01ef2-997d6cfb-pfzw8\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.477761 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.477685 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:24.599942 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:24.599913 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"]
Apr 16 20:32:24.602455 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:32:24.602429 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9d7014_96f4_48dc_8c3f_3cab21a9c0bc.slice/crio-404a6f57673c49612d0d7e7b3c2192233ef2893fef0dbd52d62df9e3596b0c8e WatchSource:0}: Error finding container 404a6f57673c49612d0d7e7b3c2192233ef2893fef0dbd52d62df9e3596b0c8e: Status 404 returned error can't find the container with id 404a6f57673c49612d0d7e7b3c2192233ef2893fef0dbd52d62df9e3596b0c8e
Apr 16 20:32:25.141237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:25.141199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" event={"ID":"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc","Type":"ContainerStarted","Data":"054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31"}
Apr 16 20:32:25.141237 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:25.141235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" event={"ID":"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc","Type":"ContainerStarted","Data":"404a6f57673c49612d0d7e7b3c2192233ef2893fef0dbd52d62df9e3596b0c8e"}
Apr 16 20:32:25.141469 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:25.141333 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
Apr 16 20:32:25.157795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:25.157741 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"
podStartSLOduration=1.157723392 podStartE2EDuration="1.157723392s" podCreationTimestamp="2026-04-16 20:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:32:25.15733032 +0000 UTC m=+1242.332399485" watchObservedRunningTime="2026-04-16 20:32:25.157723392 +0000 UTC m=+1242.332792559" Apr 16 20:32:31.150704 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:31.150673 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" Apr 16 20:32:34.255591 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:34.255511 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"] Apr 16 20:32:34.255974 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:34.255730 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" containerID="cri-o://054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31" gracePeriod=30 Apr 16 20:32:36.148343 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:36.148302 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:32:41.147725 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:41.147685 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:32:46.148494 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:46.148452 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:32:46.148896 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:46.148561 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" Apr 16 20:32:51.147813 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:51.147771 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:32:56.148297 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:56.148258 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:32:58.658117 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.658077 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"] Apr 16 20:32:58.662441 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.662424 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.664689 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.664663 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-b8212-kube-rbac-proxy-sar-config\"" Apr 16 20:32:58.664794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.664684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-b8212-serving-cert\"" Apr 16 20:32:58.669206 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.669184 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"] Apr 16 20:32:58.770073 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.770034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-openshift-service-ca-bundle\") pod \"ensemble-graph-b8212-86c7c564dd-sqwd9\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") " pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.770073 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.770076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-proxy-tls\") pod \"ensemble-graph-b8212-86c7c564dd-sqwd9\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") " pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.870847 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.870805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-openshift-service-ca-bundle\") pod 
\"ensemble-graph-b8212-86c7c564dd-sqwd9\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") " pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.870847 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.870849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-proxy-tls\") pod \"ensemble-graph-b8212-86c7c564dd-sqwd9\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") " pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.871533 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.871509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-openshift-service-ca-bundle\") pod \"ensemble-graph-b8212-86c7c564dd-sqwd9\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") " pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.873231 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.873213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-proxy-tls\") pod \"ensemble-graph-b8212-86c7c564dd-sqwd9\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") " pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:58.973487 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:58.973399 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:59.085905 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:59.085872 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"] Apr 16 20:32:59.239931 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:59.239897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" event={"ID":"72babbbf-c5c7-4a4a-bc2e-41cc59132d48","Type":"ContainerStarted","Data":"4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54"} Apr 16 20:32:59.239931 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:59.239934 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" event={"ID":"72babbbf-c5c7-4a4a-bc2e-41cc59132d48","Type":"ContainerStarted","Data":"0dea46ef13c814e93558e3fdb0a0b5730ffa3d8e70a2f77df6cc4ec46e4055e1"} Apr 16 20:32:59.240152 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:59.240005 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:32:59.256210 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:32:59.256159 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podStartSLOduration=1.256140454 podStartE2EDuration="1.256140454s" podCreationTimestamp="2026-04-16 20:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:32:59.25505273 +0000 UTC m=+1276.430121909" watchObservedRunningTime="2026-04-16 20:32:59.256140454 +0000 UTC m=+1276.431209621" Apr 16 20:33:01.147795 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:01.147759 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:33:04.283086 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:33:04.283050 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9d7014_96f4_48dc_8c3f_3cab21a9c0bc.slice/crio-404a6f57673c49612d0d7e7b3c2192233ef2893fef0dbd52d62df9e3596b0c8e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9d7014_96f4_48dc_8c3f_3cab21a9c0bc.slice/crio-054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31.scope\": RecentStats: unable to find data in memory cache]" Apr 16 20:33:04.283480 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:33:04.283277 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a9d7014_96f4_48dc_8c3f_3cab21a9c0bc.slice/crio-conmon-054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31.scope\": RecentStats: unable to find data in memory cache]" Apr 16 20:33:04.392193 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.392169 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" Apr 16 20:33:04.516171 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.516087 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-proxy-tls\") pod \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " Apr 16 20:33:04.516317 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.516179 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-openshift-service-ca-bundle\") pod \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\" (UID: \"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc\") " Apr 16 20:33:04.516556 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.516523 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" (UID: "7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:33:04.518149 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.518126 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" (UID: "7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:33:04.617248 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.617214 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:33:04.617248 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:04.617241 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:33:05.249487 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.249457 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" Apr 16 20:33:05.258604 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.258578 2574 generic.go:358] "Generic (PLEG): container finished" podID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerID="054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31" exitCode=0 Apr 16 20:33:05.258748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.258631 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" Apr 16 20:33:05.258748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.258651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" event={"ID":"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc","Type":"ContainerDied","Data":"054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31"} Apr 16 20:33:05.258748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.258686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8" event={"ID":"7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc","Type":"ContainerDied","Data":"404a6f57673c49612d0d7e7b3c2192233ef2893fef0dbd52d62df9e3596b0c8e"} Apr 16 20:33:05.258748 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.258701 2574 scope.go:117] "RemoveContainer" containerID="054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31" Apr 16 20:33:05.267090 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.266897 2574 scope.go:117] "RemoveContainer" containerID="054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31" Apr 16 20:33:05.267247 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:33:05.267214 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31\": container with ID starting with 054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31 not found: ID does not exist" containerID="054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31" Apr 16 20:33:05.267247 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.267237 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31"} err="failed to get container status 
\"054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31\": rpc error: code = NotFound desc = could not find container \"054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31\": container with ID starting with 054db9233ec370a5cf787e3f4f5e4afcf40aa9b3cbeabeb6f79327d750789b31 not found: ID does not exist" Apr 16 20:33:05.286360 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.286333 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"] Apr 16 20:33:05.289665 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.289641 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-01ef2-997d6cfb-pfzw8"] Apr 16 20:33:05.396652 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:05.396623 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" path="/var/lib/kubelet/pods/7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc/volumes" Apr 16 20:33:44.477638 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.477602 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"] Apr 16 20:33:44.478063 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.477933 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" Apr 16 20:33:44.478063 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.477948 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" Apr 16 20:33:44.478063 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.478014 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a9d7014-96f4-48dc-8c3f-3cab21a9c0bc" containerName="sequence-graph-01ef2" Apr 16 20:33:44.481253 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.481238 2574 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:44.483314 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.483284 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cc0f2-kube-rbac-proxy-sar-config\"" Apr 16 20:33:44.483428 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.483407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cc0f2-serving-cert\"" Apr 16 20:33:44.490461 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.490437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"] Apr 16 20:33:44.636085 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.636048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9efc6c83-9708-4bcc-9af8-760312d60388-openshift-service-ca-bundle\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:44.636297 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.636141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:44.736747 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.736720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: 
\"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:44.736914 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.736778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9efc6c83-9708-4bcc-9af8-760312d60388-openshift-service-ca-bundle\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:44.736914 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:33:44.736850 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-cc0f2-serving-cert: secret "sequence-graph-cc0f2-serving-cert" not found Apr 16 20:33:44.736995 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:33:44.736918 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls podName:9efc6c83-9708-4bcc-9af8-760312d60388 nodeName:}" failed. No retries permitted until 2026-04-16 20:33:45.236902789 +0000 UTC m=+1322.411971939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls") pod "sequence-graph-cc0f2-8b5c9b9f7-vvxvk" (UID: "9efc6c83-9708-4bcc-9af8-760312d60388") : secret "sequence-graph-cc0f2-serving-cert" not found Apr 16 20:33:44.737338 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:44.737322 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9efc6c83-9708-4bcc-9af8-760312d60388-openshift-service-ca-bundle\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:45.242289 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:45.242259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:45.244742 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:45.244709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls\") pod \"sequence-graph-cc0f2-8b5c9b9f7-vvxvk\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") " pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:45.391490 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:45.391457 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:45.507012 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:45.506937 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"] Apr 16 20:33:45.512732 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:33:45.512688 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9efc6c83_9708_4bcc_9af8_760312d60388.slice/crio-9e74fdb2d2874416fad1daa9f2b09e0bb126a8ce42901bfa71b237c0038a2a6f WatchSource:0}: Error finding container 9e74fdb2d2874416fad1daa9f2b09e0bb126a8ce42901bfa71b237c0038a2a6f: Status 404 returned error can't find the container with id 9e74fdb2d2874416fad1daa9f2b09e0bb126a8ce42901bfa71b237c0038a2a6f Apr 16 20:33:46.376977 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:46.376936 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" event={"ID":"9efc6c83-9708-4bcc-9af8-760312d60388","Type":"ContainerStarted","Data":"cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e"} Apr 16 20:33:46.376977 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:46.376972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" event={"ID":"9efc6c83-9708-4bcc-9af8-760312d60388","Type":"ContainerStarted","Data":"9e74fdb2d2874416fad1daa9f2b09e0bb126a8ce42901bfa71b237c0038a2a6f"} Apr 16 20:33:46.377212 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:46.377008 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" Apr 16 20:33:46.394487 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:46.394442 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" 
podStartSLOduration=2.394428744 podStartE2EDuration="2.394428744s" podCreationTimestamp="2026-04-16 20:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:33:46.393146877 +0000 UTC m=+1323.568216046" watchObservedRunningTime="2026-04-16 20:33:46.394428744 +0000 UTC m=+1323.569497910"
Apr 16 20:33:52.385462 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:33:52.385435 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"
Apr 16 20:41:13.337568 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:13.337532 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"]
Apr 16 20:41:13.338143 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:13.337764 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" containerID="cri-o://4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54" gracePeriod=30
Apr 16 20:41:15.248048 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:15.248007 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:20.247070 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:20.247032 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:25.247220 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:25.247181 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:25.247614 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:25.247284 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"
Apr 16 20:41:30.248070 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:30.248028 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:35.248168 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:35.248071 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:40.247376 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:40.247331 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:43.480604 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.480577 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"
Apr 16 20:41:43.542567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.542531 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-openshift-service-ca-bundle\") pod \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") "
Apr 16 20:41:43.542567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.542568 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-proxy-tls\") pod \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\" (UID: \"72babbbf-c5c7-4a4a-bc2e-41cc59132d48\") "
Apr 16 20:41:43.542886 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.542861 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "72babbbf-c5c7-4a4a-bc2e-41cc59132d48" (UID: "72babbbf-c5c7-4a4a-bc2e-41cc59132d48"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:41:43.544633 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.544608 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "72babbbf-c5c7-4a4a-bc2e-41cc59132d48" (UID: "72babbbf-c5c7-4a4a-bc2e-41cc59132d48"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:41:43.643215 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.643130 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:41:43.643215 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.643165 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72babbbf-c5c7-4a4a-bc2e-41cc59132d48-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:41:43.712410 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.712376 2574 generic.go:358] "Generic (PLEG): container finished" podID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerID="4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54" exitCode=0
Apr 16 20:41:43.712582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.712438 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"
Apr 16 20:41:43.712582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.712462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" event={"ID":"72babbbf-c5c7-4a4a-bc2e-41cc59132d48","Type":"ContainerDied","Data":"4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54"}
Apr 16 20:41:43.712582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.712495 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9" event={"ID":"72babbbf-c5c7-4a4a-bc2e-41cc59132d48","Type":"ContainerDied","Data":"0dea46ef13c814e93558e3fdb0a0b5730ffa3d8e70a2f77df6cc4ec46e4055e1"}
Apr 16 20:41:43.712582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.712510 2574 scope.go:117] "RemoveContainer" containerID="4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54"
Apr 16 20:41:43.720039 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.720023 2574 scope.go:117] "RemoveContainer" containerID="4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54"
Apr 16 20:41:43.720335 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:41:43.720310 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54\": container with ID starting with 4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54 not found: ID does not exist" containerID="4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54"
Apr 16 20:41:43.720431 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.720343 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54"} err="failed to get container status \"4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54\": rpc error: code = NotFound desc = could not find container \"4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54\": container with ID starting with 4917f344a5d020dcebd58bdc8be62eb070ad219e85d7126acfe836b5f5c05c54 not found: ID does not exist"
Apr 16 20:41:43.732671 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.732648 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"]
Apr 16 20:41:43.736701 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:43.736679 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-b8212-86c7c564dd-sqwd9"]
Apr 16 20:41:45.396966 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:45.396930 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" path="/var/lib/kubelet/pods/72babbbf-c5c7-4a4a-bc2e-41cc59132d48/volumes"
Apr 16 20:41:59.142695 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:59.142659 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"]
Apr 16 20:41:59.143082 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:41:59.142908 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" containerID="cri-o://cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e" gracePeriod=30
Apr 16 20:42:02.383886 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:02.383843 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:07.383930 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:07.383891 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:12.383422 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:12.383386 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:12.383803 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:12.383506 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"
Apr 16 20:42:17.384803 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:17.384754 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:22.383302 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:22.383260 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:23.582831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.582755 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"]
Apr 16 20:42:23.583189 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.583060 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212"
Apr 16 20:42:23.583189 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.583075 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212"
Apr 16 20:42:23.583189 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.583154 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="72babbbf-c5c7-4a4a-bc2e-41cc59132d48" containerName="ensemble-graph-b8212"
Apr 16 20:42:23.585726 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.585704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:23.588241 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.588222 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-932a3-serving-cert\""
Apr 16 20:42:23.588683 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.588667 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-932a3-kube-rbac-proxy-sar-config\""
Apr 16 20:42:23.597084 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.597061 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"]
Apr 16 20:42:23.662586 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.662548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:23.662586 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.662590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-openshift-service-ca-bundle\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:23.763019 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.762983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:23.763227 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.763034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-openshift-service-ca-bundle\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:23.763227 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:42:23.763177 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-932a3-serving-cert: secret "splitter-graph-932a3-serving-cert" not found
Apr 16 20:42:23.763313 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:42:23.763252 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls podName:13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3 nodeName:}" failed. No retries permitted until 2026-04-16 20:42:24.263230353 +0000 UTC m=+1841.438299499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls") pod "splitter-graph-932a3-c77946655-t5lk8" (UID: "13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3") : secret "splitter-graph-932a3-serving-cert" not found
Apr 16 20:42:23.763750 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:23.763730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-openshift-service-ca-bundle\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:24.266966 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.266912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:24.269309 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.269287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls\") pod \"splitter-graph-932a3-c77946655-t5lk8\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") " pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:24.498791 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.498735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:24.621654 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.621494 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"]
Apr 16 20:42:24.624477 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:42:24.624449 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13eaa2f0_15a7_4f1b_88bb_e22910b4e1a3.slice/crio-8f75325ed1dbe43729bd2ef94a6b71ffe90aca9a6f2ecce08b3c2f5fc8526215 WatchSource:0}: Error finding container 8f75325ed1dbe43729bd2ef94a6b71ffe90aca9a6f2ecce08b3c2f5fc8526215: Status 404 returned error can't find the container with id 8f75325ed1dbe43729bd2ef94a6b71ffe90aca9a6f2ecce08b3c2f5fc8526215
Apr 16 20:42:24.626140 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.626122 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:42:24.822318 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.822222 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" event={"ID":"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3","Type":"ContainerStarted","Data":"a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7"}
Apr 16 20:42:24.822318 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.822270 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" event={"ID":"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3","Type":"ContainerStarted","Data":"8f75325ed1dbe43729bd2ef94a6b71ffe90aca9a6f2ecce08b3c2f5fc8526215"}
Apr 16 20:42:24.822504 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.822360 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:24.839266 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:24.839213 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podStartSLOduration=1.839198645 podStartE2EDuration="1.839198645s" podCreationTimestamp="2026-04-16 20:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:42:24.83780921 +0000 UTC m=+1842.012878376" watchObservedRunningTime="2026-04-16 20:42:24.839198645 +0000 UTC m=+1842.014267812"
Apr 16 20:42:27.383491 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:27.383454 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:29.279320 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.279299 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"
Apr 16 20:42:29.304582 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.304548 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls\") pod \"9efc6c83-9708-4bcc-9af8-760312d60388\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") "
Apr 16 20:42:29.304777 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.304646 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9efc6c83-9708-4bcc-9af8-760312d60388-openshift-service-ca-bundle\") pod \"9efc6c83-9708-4bcc-9af8-760312d60388\" (UID: \"9efc6c83-9708-4bcc-9af8-760312d60388\") "
Apr 16 20:42:29.305186 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.305160 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9efc6c83-9708-4bcc-9af8-760312d60388-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9efc6c83-9708-4bcc-9af8-760312d60388" (UID: "9efc6c83-9708-4bcc-9af8-760312d60388"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:29.306831 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.306808 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9efc6c83-9708-4bcc-9af8-760312d60388" (UID: "9efc6c83-9708-4bcc-9af8-760312d60388"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:29.405678 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.405587 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9efc6c83-9708-4bcc-9af8-760312d60388-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:42:29.405678 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.405613 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9efc6c83-9708-4bcc-9af8-760312d60388-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:42:29.836300 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.836262 2574 generic.go:358] "Generic (PLEG): container finished" podID="9efc6c83-9708-4bcc-9af8-760312d60388" containerID="cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e" exitCode=0
Apr 16 20:42:29.836480 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.836341 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" event={"ID":"9efc6c83-9708-4bcc-9af8-760312d60388","Type":"ContainerDied","Data":"cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e"}
Apr 16 20:42:29.836480 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.836378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk" event={"ID":"9efc6c83-9708-4bcc-9af8-760312d60388","Type":"ContainerDied","Data":"9e74fdb2d2874416fad1daa9f2b09e0bb126a8ce42901bfa71b237c0038a2a6f"}
Apr 16 20:42:29.836480 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.836393 2574 scope.go:117] "RemoveContainer" containerID="cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e"
Apr 16 20:42:29.836480 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.836350 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"
Apr 16 20:42:29.843760 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.843742 2574 scope.go:117] "RemoveContainer" containerID="cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e"
Apr 16 20:42:29.843998 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:42:29.843978 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e\": container with ID starting with cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e not found: ID does not exist" containerID="cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e"
Apr 16 20:42:29.844055 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.844008 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e"} err="failed to get container status \"cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e\": rpc error: code = NotFound desc = could not find container \"cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e\": container with ID starting with cff366f436e3e79b60a39d8a6aba5d4eedcbee9e3196c66ebdd62a11e844242e not found: ID does not exist"
Apr 16 20:42:29.851809 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.851786 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"]
Apr 16 20:42:29.853730 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:29.853712 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cc0f2-8b5c9b9f7-vvxvk"]
Apr 16 20:42:30.830914 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:30.830885 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:31.396381 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:31.396350 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" path="/var/lib/kubelet/pods/9efc6c83-9708-4bcc-9af8-760312d60388/volumes"
Apr 16 20:42:33.655897 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:33.655867 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"]
Apr 16 20:42:33.656368 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:33.656060 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" containerID="cri-o://a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7" gracePeriod=30
Apr 16 20:42:35.829609 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:35.829569 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:40.829317 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:40.829273 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:45.829100 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:45.829054 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:45.829478 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:45.829183 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:42:50.829508 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:50.829466 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:42:55.833464 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:42:55.833422 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:43:00.829782 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:00.829743 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:43:03.688235 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:43:03.688198 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13eaa2f0_15a7_4f1b_88bb_e22910b4e1a3.slice/crio-conmon-a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 20:43:03.790973 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.790946 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:43:03.879990 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.879956 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls\") pod \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") "
Apr 16 20:43:03.880174 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.880004 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-openshift-service-ca-bundle\") pod \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\" (UID: \"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3\") "
Apr 16 20:43:03.880380 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.880351 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" (UID: "13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:43:03.881969 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.881941 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" (UID: "13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:43:03.935567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.935480 2574 generic.go:358] "Generic (PLEG): container finished" podID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerID="a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7" exitCode=0
Apr 16 20:43:03.935567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.935518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" event={"ID":"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3","Type":"ContainerDied","Data":"a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7"}
Apr 16 20:43:03.935567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.935541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8" event={"ID":"13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3","Type":"ContainerDied","Data":"8f75325ed1dbe43729bd2ef94a6b71ffe90aca9a6f2ecce08b3c2f5fc8526215"}
Apr 16 20:43:03.935567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.935546 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"
Apr 16 20:43:03.935567 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.935556 2574 scope.go:117] "RemoveContainer" containerID="a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7"
Apr 16 20:43:03.943343 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.943234 2574 scope.go:117] "RemoveContainer" containerID="a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7"
Apr 16 20:43:03.943622 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:43:03.943600 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7\": container with ID starting with a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7 not found: ID does not exist" containerID="a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7"
Apr 16 20:43:03.943706 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.943633 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7"} err="failed to get container status \"a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7\": rpc error: code = NotFound desc = could not find container \"a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7\": container with ID starting with a4a44ea08b01ce24ecc2acf158ebd8130c3dadda24fe8b8f9c1058449fe7b0f7 not found: ID does not exist"
Apr 16 20:43:03.955974 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.955952 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"]
Apr 16 20:43:03.959275 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.959252 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932a3-c77946655-t5lk8"]
Apr 16 20:43:03.981312 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.981288 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:43:03.981312 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:03.981311 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 16 20:43:05.396901 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:05.396857 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" path="/var/lib/kubelet/pods/13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3/volumes"
Apr 16 20:43:09.361529 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361492 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"]
Apr 16 20:43:09.361904 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361791 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3"
Apr 16 20:43:09.361904 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361803 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3"
Apr 16 20:43:09.361904 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361825 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2"
Apr 16 20:43:09.361904 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361831 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2"
Apr 16 20:43:09.361904 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361885 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9efc6c83-9708-4bcc-9af8-760312d60388" containerName="sequence-graph-cc0f2"
Apr 16 20:43:09.361904 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.361896 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="13eaa2f0-15a7-4f1b-88bb-e22910b4e1a3" containerName="splitter-graph-932a3"
Apr 16 20:43:09.366136 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.366097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"
Apr 16 20:43:09.368562 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.368539 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a03d4-kube-rbac-proxy-sar-config\""
Apr 16 20:43:09.368562 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.368557 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a03d4-serving-cert\""
Apr 16 20:43:09.368779 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.368761 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5hj24\""
Apr 16 20:43:09.368832 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.368761 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:43:09.371594 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.371573 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"]
Apr 16 20:43:09.422188 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.422145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d98ff6a5-3a67-4285-94d0-d7659e0c9414-openshift-service-ca-bundle\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"
Apr 16 20:43:09.422188 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.422191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"
Apr 16 20:43:09.522614 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.522570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d98ff6a5-3a67-4285-94d0-d7659e0c9414-openshift-service-ca-bundle\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"
Apr 16 20:43:09.522614 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.522608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"
Apr 16 20:43:09.522866 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:43:09.522727 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-a03d4-serving-cert: secret "switch-graph-a03d4-serving-cert" not found
Apr 16 20:43:09.522866 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:43:09.522793 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls podName:d98ff6a5-3a67-4285-94d0-d7659e0c9414 nodeName:}" failed. No retries permitted until 2026-04-16 20:43:10.022772106 +0000 UTC m=+1887.197841252 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls") pod "switch-graph-a03d4-7967858c6b-gzxdj" (UID: "d98ff6a5-3a67-4285-94d0-d7659e0c9414") : secret "switch-graph-a03d4-serving-cert" not found Apr 16 20:43:09.523220 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:09.523198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d98ff6a5-3a67-4285-94d0-d7659e0c9414-openshift-service-ca-bundle\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:43:10.025252 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.025217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:43:10.027530 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.027503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls\") pod \"switch-graph-a03d4-7967858c6b-gzxdj\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:43:10.277689 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.277606 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:43:10.390130 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.390094 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"] Apr 16 20:43:10.392666 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:43:10.392641 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd98ff6a5_3a67_4285_94d0_d7659e0c9414.slice/crio-7694e4e84edf09badb2e05997c583ab85fa134c3c61d819e8437fbc0ee51f429 WatchSource:0}: Error finding container 7694e4e84edf09badb2e05997c583ab85fa134c3c61d819e8437fbc0ee51f429: Status 404 returned error can't find the container with id 7694e4e84edf09badb2e05997c583ab85fa134c3c61d819e8437fbc0ee51f429 Apr 16 20:43:10.955679 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.955643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" event={"ID":"d98ff6a5-3a67-4285-94d0-d7659e0c9414","Type":"ContainerStarted","Data":"ed8ed4fa77a7cb37d310f28c1973a5a551fb0698c084e19f9e15ba8f60b2aaf6"} Apr 16 20:43:10.955679 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.955681 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" event={"ID":"d98ff6a5-3a67-4285-94d0-d7659e0c9414","Type":"ContainerStarted","Data":"7694e4e84edf09badb2e05997c583ab85fa134c3c61d819e8437fbc0ee51f429"} Apr 16 20:43:10.955940 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.955712 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:43:10.973766 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:10.973715 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" 
podStartSLOduration=1.973697997 podStartE2EDuration="1.973697997s" podCreationTimestamp="2026-04-16 20:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:43:10.972083731 +0000 UTC m=+1888.147152897" watchObservedRunningTime="2026-04-16 20:43:10.973697997 +0000 UTC m=+1888.148767164" Apr 16 20:43:16.963823 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:16.963792 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:43:43.878181 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.878147 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8"] Apr 16 20:43:43.882970 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.882948 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:43.885135 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.885101 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-ee4eb-kube-rbac-proxy-sar-config\"" Apr 16 20:43:43.885249 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.885146 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-ee4eb-serving-cert\"" Apr 16 20:43:43.887867 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.887844 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8"] Apr 16 20:43:43.893744 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.893724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0eab8b18-3311-4e61-9553-488bbd940cf7-openshift-service-ca-bundle\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:43.893842 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.893770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:43.994302 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.994265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eab8b18-3311-4e61-9553-488bbd940cf7-openshift-service-ca-bundle\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:43.994466 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.994330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:43.994466 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:43:43.994436 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-ee4eb-serving-cert: secret "splitter-graph-ee4eb-serving-cert" not found Apr 16 20:43:43.994583 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:43:43.994504 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls podName:0eab8b18-3311-4e61-9553-488bbd940cf7 nodeName:}" failed. No retries permitted until 2026-04-16 20:43:44.494489347 +0000 UTC m=+1921.669558493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls") pod "splitter-graph-ee4eb-6c7dd7d556-krjp8" (UID: "0eab8b18-3311-4e61-9553-488bbd940cf7") : secret "splitter-graph-ee4eb-serving-cert" not found Apr 16 20:43:43.994898 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:43.994878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eab8b18-3311-4e61-9553-488bbd940cf7-openshift-service-ca-bundle\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:44.498743 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:44.498707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:44.500983 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:44.500952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls\") pod \"splitter-graph-ee4eb-6c7dd7d556-krjp8\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:44.792929 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:44.792824 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:44.908406 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:44.908381 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8"] Apr 16 20:43:44.910723 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:43:44.910692 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eab8b18_3311_4e61_9553_488bbd940cf7.slice/crio-e4d48cc31a843b02bcd9806619e5cc5eb655815724e0a56424b1506d693a56db WatchSource:0}: Error finding container e4d48cc31a843b02bcd9806619e5cc5eb655815724e0a56424b1506d693a56db: Status 404 returned error can't find the container with id e4d48cc31a843b02bcd9806619e5cc5eb655815724e0a56424b1506d693a56db Apr 16 20:43:45.049640 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:45.049550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" event={"ID":"0eab8b18-3311-4e61-9553-488bbd940cf7","Type":"ContainerStarted","Data":"fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91"} Apr 16 20:43:45.049640 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:45.049598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" event={"ID":"0eab8b18-3311-4e61-9553-488bbd940cf7","Type":"ContainerStarted","Data":"e4d48cc31a843b02bcd9806619e5cc5eb655815724e0a56424b1506d693a56db"} Apr 16 20:43:45.049812 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:45.049693 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:43:45.066889 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:45.066835 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" 
podStartSLOduration=2.066821699 podStartE2EDuration="2.066821699s" podCreationTimestamp="2026-04-16 20:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:43:45.06603566 +0000 UTC m=+1922.241104830" watchObservedRunningTime="2026-04-16 20:43:45.066821699 +0000 UTC m=+1922.241890868" Apr 16 20:43:51.060268 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:43:51.060236 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:51:58.605047 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:51:58.605014 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8"] Apr 16 20:51:58.607794 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:51:58.605257 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" containerID="cri-o://fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91" gracePeriod=30 Apr 16 20:52:01.059087 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:01.059045 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:52:06.059828 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:06.059739 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:52:11.058823 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:52:11.058784 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:52:11.059226 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:11.058893 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:52:16.059692 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:16.059652 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:52:21.059767 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:21.059725 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:52:26.059301 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:26.059262 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:52:28.739946 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.739918 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:52:28.784283 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.784255 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls\") pod \"0eab8b18-3311-4e61-9553-488bbd940cf7\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " Apr 16 20:52:28.784466 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.784304 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eab8b18-3311-4e61-9553-488bbd940cf7-openshift-service-ca-bundle\") pod \"0eab8b18-3311-4e61-9553-488bbd940cf7\" (UID: \"0eab8b18-3311-4e61-9553-488bbd940cf7\") " Apr 16 20:52:28.784650 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.784628 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eab8b18-3311-4e61-9553-488bbd940cf7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0eab8b18-3311-4e61-9553-488bbd940cf7" (UID: "0eab8b18-3311-4e61-9553-488bbd940cf7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:52:28.786291 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.786271 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0eab8b18-3311-4e61-9553-488bbd940cf7" (UID: "0eab8b18-3311-4e61-9553-488bbd940cf7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:52:28.885918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.885835 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eab8b18-3311-4e61-9553-488bbd940cf7-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:52:28.885918 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:28.885867 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eab8b18-3311-4e61-9553-488bbd940cf7-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:52:29.513024 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.512988 2574 generic.go:358] "Generic (PLEG): container finished" podID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerID="fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91" exitCode=0 Apr 16 20:52:29.513223 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.513051 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" Apr 16 20:52:29.513223 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.513069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" event={"ID":"0eab8b18-3311-4e61-9553-488bbd940cf7","Type":"ContainerDied","Data":"fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91"} Apr 16 20:52:29.513223 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.513123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8" event={"ID":"0eab8b18-3311-4e61-9553-488bbd940cf7","Type":"ContainerDied","Data":"e4d48cc31a843b02bcd9806619e5cc5eb655815724e0a56424b1506d693a56db"} Apr 16 20:52:29.513223 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.513140 2574 scope.go:117] "RemoveContainer" containerID="fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91" Apr 16 20:52:29.520812 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.520796 2574 scope.go:117] "RemoveContainer" containerID="fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91" Apr 16 20:52:29.521059 ip-10-0-134-158 kubenswrapper[2574]: E0416 20:52:29.521034 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91\": container with ID starting with fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91 not found: ID does not exist" containerID="fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91" Apr 16 20:52:29.521134 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.521060 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91"} err="failed to get container status 
\"fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91\": rpc error: code = NotFound desc = could not find container \"fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91\": container with ID starting with fca770997e061191cd1cc990a7f3e2d2eb4d2c61cda51d740eb6572476a04a91 not found: ID does not exist" Apr 16 20:52:29.527304 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.527282 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8"] Apr 16 20:52:29.530836 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:29.530816 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ee4eb-6c7dd7d556-krjp8"] Apr 16 20:52:31.397167 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:52:31.397135 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" path="/var/lib/kubelet/pods/0eab8b18-3311-4e61-9553-488bbd940cf7/volumes" Apr 16 20:59:28.736474 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:28.736440 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"] Apr 16 20:59:28.739169 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:28.736666 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4" containerID="cri-o://ed8ed4fa77a7cb37d310f28c1973a5a551fb0698c084e19f9e15ba8f60b2aaf6" gracePeriod=30 Apr 16 20:59:30.411008 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.410972 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b5lv7/must-gather-9xxxf"] Apr 16 20:59:30.411373 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.411274 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" 
containerName="splitter-graph-ee4eb" Apr 16 20:59:30.411373 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.411286 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" Apr 16 20:59:30.411373 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.411345 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0eab8b18-3311-4e61-9553-488bbd940cf7" containerName="splitter-graph-ee4eb" Apr 16 20:59:30.414265 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.414248 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.416662 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.416645 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5lv7\"/\"openshift-service-ca.crt\"" Apr 16 20:59:30.416754 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.416736 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b5lv7\"/\"default-dockercfg-wx5w4\"" Apr 16 20:59:30.417508 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.417493 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5lv7\"/\"kube-root-ca.crt\"" Apr 16 20:59:30.432825 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.432798 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5lv7/must-gather-9xxxf"] Apr 16 20:59:30.526937 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.526891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-must-gather-output\") pod \"must-gather-9xxxf\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.527164 
ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.527041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vkb\" (UniqueName: \"kubernetes.io/projected/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-kube-api-access-x6vkb\") pod \"must-gather-9xxxf\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.627715 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.627673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vkb\" (UniqueName: \"kubernetes.io/projected/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-kube-api-access-x6vkb\") pod \"must-gather-9xxxf\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.627715 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.627721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-must-gather-output\") pod \"must-gather-9xxxf\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.628031 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.628014 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-must-gather-output\") pod \"must-gather-9xxxf\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.636530 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.636502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vkb\" (UniqueName: \"kubernetes.io/projected/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-kube-api-access-x6vkb\") pod \"must-gather-9xxxf\" (UID: 
\"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.731143 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.731044 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:30.850490 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.850465 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5lv7/must-gather-9xxxf"] Apr 16 20:59:30.852922 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:59:30.852888 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18c9e94_06ed_4317_84c0_5fe3fa8ca5ad.slice/crio-6b80fb21d8f984db41d1b8f4e6ee2a2bcb21a249083b8fa1df9e7a3ba483bb90 WatchSource:0}: Error finding container 6b80fb21d8f984db41d1b8f4e6ee2a2bcb21a249083b8fa1df9e7a3ba483bb90: Status 404 returned error can't find the container with id 6b80fb21d8f984db41d1b8f4e6ee2a2bcb21a249083b8fa1df9e7a3ba483bb90 Apr 16 20:59:30.854513 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:30.854499 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:59:31.685700 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:31.685644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" event={"ID":"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad","Type":"ContainerStarted","Data":"6b80fb21d8f984db41d1b8f4e6ee2a2bcb21a249083b8fa1df9e7a3ba483bb90"} Apr 16 20:59:31.964293 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:31.964024 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:59:35.699952 ip-10-0-134-158 kubenswrapper[2574]: I0416 
20:59:35.699918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" event={"ID":"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad","Type":"ContainerStarted","Data":"65f02825b5275eaa4397c78b7dc3fee547967bd0ef08dcfe25bbb1a087167edd"} Apr 16 20:59:35.699952 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:35.699956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" event={"ID":"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad","Type":"ContainerStarted","Data":"07c39f81442c92b32408699f28df8eeeb96cf18a0b4ed350a999f331847565e4"} Apr 16 20:59:36.962963 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:36.962912 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:59:41.963660 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:41.963606 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:59:41.964308 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:41.963747 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:59:41.980132 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:41.980045 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" podStartSLOduration=7.995577303 podStartE2EDuration="11.980024753s" podCreationTimestamp="2026-04-16 20:59:30 +0000 UTC" firstStartedPulling="2026-04-16 20:59:30.854623253 +0000 UTC m=+2868.029692399" lastFinishedPulling="2026-04-16 20:59:34.839070703 
+0000 UTC m=+2872.014139849" observedRunningTime="2026-04-16 20:59:35.727553063 +0000 UTC m=+2872.902622229" watchObservedRunningTime="2026-04-16 20:59:41.980024753 +0000 UTC m=+2879.155093922" Apr 16 20:59:43.660337 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:43.660308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:44.367552 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:44.367517 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:45.092465 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:45.092435 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:45.789568 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:45.789539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:46.495761 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:46.495729 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:46.962559 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:46.962470 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:59:47.199013 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:47.198984 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:47.911092 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:47.911058 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:48.595886 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:48.595856 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:49.286945 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:49.286916 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:49.968774 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:49.968742 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:50.666059 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:50.666014 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:51.362357 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:51.362316 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a03d4-7967858c6b-gzxdj_d98ff6a5-3a67-4285-94d0-d7659e0c9414/switch-graph-a03d4/0.log" Apr 16 20:59:51.962810 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:51.962768 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" 
containerName="switch-graph-a03d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:59:53.756423 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:53.756389 2574 generic.go:358] "Generic (PLEG): container finished" podID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerID="07c39f81442c92b32408699f28df8eeeb96cf18a0b4ed350a999f331847565e4" exitCode=0 Apr 16 20:59:53.756834 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:53.756449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" event={"ID":"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad","Type":"ContainerDied","Data":"07c39f81442c92b32408699f28df8eeeb96cf18a0b4ed350a999f331847565e4"} Apr 16 20:59:53.756834 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:53.756751 2574 scope.go:117] "RemoveContainer" containerID="07c39f81442c92b32408699f28df8eeeb96cf18a0b4ed350a999f331847565e4" Apr 16 20:59:54.167640 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.167550 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b5lv7_must-gather-9xxxf_b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad/gather/0.log" Apr 16 20:59:54.748338 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.748302 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcf77/must-gather-r47ch"] Apr 16 20:59:54.752031 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.752009 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:54.754235 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.754213 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wcf77\"/\"default-dockercfg-rx7m6\"" Apr 16 20:59:54.754359 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.754262 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wcf77\"/\"kube-root-ca.crt\"" Apr 16 20:59:54.755151 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.755126 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wcf77\"/\"openshift-service-ca.crt\"" Apr 16 20:59:54.760947 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.760923 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/must-gather-r47ch"] Apr 16 20:59:54.850316 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.850281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpc4\" (UniqueName: \"kubernetes.io/projected/2a4ba0e3-11fa-45c7-a231-2f987a60944c-kube-api-access-kmpc4\") pod \"must-gather-r47ch\" (UID: \"2a4ba0e3-11fa-45c7-a231-2f987a60944c\") " pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:54.850517 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.850329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4ba0e3-11fa-45c7-a231-2f987a60944c-must-gather-output\") pod \"must-gather-r47ch\" (UID: \"2a4ba0e3-11fa-45c7-a231-2f987a60944c\") " pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:54.951684 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.951641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/2a4ba0e3-11fa-45c7-a231-2f987a60944c-must-gather-output\") pod \"must-gather-r47ch\" (UID: \"2a4ba0e3-11fa-45c7-a231-2f987a60944c\") " pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:54.951847 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.951729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpc4\" (UniqueName: \"kubernetes.io/projected/2a4ba0e3-11fa-45c7-a231-2f987a60944c-kube-api-access-kmpc4\") pod \"must-gather-r47ch\" (UID: \"2a4ba0e3-11fa-45c7-a231-2f987a60944c\") " pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:54.952013 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.951986 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4ba0e3-11fa-45c7-a231-2f987a60944c-must-gather-output\") pod \"must-gather-r47ch\" (UID: \"2a4ba0e3-11fa-45c7-a231-2f987a60944c\") " pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:54.959870 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:54.959837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpc4\" (UniqueName: \"kubernetes.io/projected/2a4ba0e3-11fa-45c7-a231-2f987a60944c-kube-api-access-kmpc4\") pod \"must-gather-r47ch\" (UID: \"2a4ba0e3-11fa-45c7-a231-2f987a60944c\") " pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:55.061378 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:55.061292 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcf77/must-gather-r47ch" Apr 16 20:59:55.179523 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:55.179498 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/must-gather-r47ch"] Apr 16 20:59:55.183918 ip-10-0-134-158 kubenswrapper[2574]: W0416 20:59:55.183714 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a4ba0e3_11fa_45c7_a231_2f987a60944c.slice/crio-b78bc51f5c0ede37b51f49123229e66c3199cbd1c39ceb173ec05ab98dfc66ef WatchSource:0}: Error finding container b78bc51f5c0ede37b51f49123229e66c3199cbd1c39ceb173ec05ab98dfc66ef: Status 404 returned error can't find the container with id b78bc51f5c0ede37b51f49123229e66c3199cbd1c39ceb173ec05ab98dfc66ef Apr 16 20:59:55.763245 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:55.763213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/must-gather-r47ch" event={"ID":"2a4ba0e3-11fa-45c7-a231-2f987a60944c","Type":"ContainerStarted","Data":"b78bc51f5c0ede37b51f49123229e66c3199cbd1c39ceb173ec05ab98dfc66ef"} Apr 16 20:59:56.768424 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:56.768393 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/must-gather-r47ch" event={"ID":"2a4ba0e3-11fa-45c7-a231-2f987a60944c","Type":"ContainerStarted","Data":"20621b3affc9dfe3a79db686e681e92dec433e5eaded77abc762264b1eb3ef84"} Apr 16 20:59:56.768424 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:56.768430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/must-gather-r47ch" event={"ID":"2a4ba0e3-11fa-45c7-a231-2f987a60944c","Type":"ContainerStarted","Data":"a4d9b9f2e9528c0924e2c1319ead1d931cacfd0da13520926773a4f8a15d12b1"} Apr 16 20:59:56.784276 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:56.784222 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-wcf77/must-gather-r47ch" podStartSLOduration=1.9176777999999999 podStartE2EDuration="2.784205104s" podCreationTimestamp="2026-04-16 20:59:54 +0000 UTC" firstStartedPulling="2026-04-16 20:59:55.185404159 +0000 UTC m=+2892.360473305" lastFinishedPulling="2026-04-16 20:59:56.051931462 +0000 UTC m=+2893.227000609" observedRunningTime="2026-04-16 20:59:56.782748898 +0000 UTC m=+2893.957818078" watchObservedRunningTime="2026-04-16 20:59:56.784205104 +0000 UTC m=+2893.959274270" Apr 16 20:59:56.963210 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:56.963169 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:59:57.441673 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:57.441576 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tjktw_9a8f93c5-d267-47b5-a685-fc5bd8269d88/global-pull-secret-syncer/0.log" Apr 16 20:59:57.541007 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:57.540976 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-m9rqh_09d0c13a-a560-4064-a9eb-9f9a9df65df6/konnectivity-agent/0.log" Apr 16 20:59:57.650286 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:57.650250 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-158.ec2.internal_51b4526a209496dd9377d4f989eaa37c/haproxy/0.log" Apr 16 20:59:58.780361 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:58.780185 2574 generic.go:358] "Generic (PLEG): container finished" podID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerID="ed8ed4fa77a7cb37d310f28c1973a5a551fb0698c084e19f9e15ba8f60b2aaf6" exitCode=0 Apr 16 20:59:58.780361 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:58.780300 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" event={"ID":"d98ff6a5-3a67-4285-94d0-d7659e0c9414","Type":"ContainerDied","Data":"ed8ed4fa77a7cb37d310f28c1973a5a551fb0698c084e19f9e15ba8f60b2aaf6"} Apr 16 20:59:59.539218 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.538804 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:59:59.588931 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.588163 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b5lv7/must-gather-9xxxf"] Apr 16 20:59:59.588931 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.588436 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="copy" containerID="cri-o://65f02825b5275eaa4397c78b7dc3fee547967bd0ef08dcfe25bbb1a087167edd" gracePeriod=2 Apr 16 20:59:59.590804 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.590347 2574 status_manager.go:895] "Failed to get status for pod" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" err="pods \"must-gather-9xxxf\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b5lv7\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 16 20:59:59.590804 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.590748 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b5lv7/must-gather-9xxxf"] Apr 16 20:59:59.706132 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.704311 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d98ff6a5-3a67-4285-94d0-d7659e0c9414-openshift-service-ca-bundle\") pod \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " Apr 16 20:59:59.706132 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.704391 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls\") pod \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\" (UID: \"d98ff6a5-3a67-4285-94d0-d7659e0c9414\") " Apr 16 20:59:59.706132 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.705159 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d98ff6a5-3a67-4285-94d0-d7659e0c9414-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d98ff6a5-3a67-4285-94d0-d7659e0c9414" (UID: "d98ff6a5-3a67-4285-94d0-d7659e0c9414"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:59:59.717218 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.717142 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d98ff6a5-3a67-4285-94d0-d7659e0c9414" (UID: "d98ff6a5-3a67-4285-94d0-d7659e0c9414"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:59:59.789257 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.786656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" event={"ID":"d98ff6a5-3a67-4285-94d0-d7659e0c9414","Type":"ContainerDied","Data":"7694e4e84edf09badb2e05997c583ab85fa134c3c61d819e8437fbc0ee51f429"} Apr 16 20:59:59.789257 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.786728 2574 scope.go:117] "RemoveContainer" containerID="ed8ed4fa77a7cb37d310f28c1973a5a551fb0698c084e19f9e15ba8f60b2aaf6" Apr 16 20:59:59.789257 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.786877 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj" Apr 16 20:59:59.800345 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.800308 2574 status_manager.go:895] "Failed to get status for pod" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" err="pods \"must-gather-9xxxf\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b5lv7\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 16 20:59:59.806142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.805749 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d98ff6a5-3a67-4285-94d0-d7659e0c9414-openshift-service-ca-bundle\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:59:59.806142 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.805776 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d98ff6a5-3a67-4285-94d0-d7659e0c9414-proxy-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 20:59:59.811532 
ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.809887 2574 status_manager.go:895] "Failed to get status for pod" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" err="pods \"must-gather-9xxxf\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b5lv7\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 16 20:59:59.812829 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.812790 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b5lv7_must-gather-9xxxf_b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad/copy/0.log" Apr 16 20:59:59.814579 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.813345 2574 generic.go:358] "Generic (PLEG): container finished" podID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerID="65f02825b5275eaa4397c78b7dc3fee547967bd0ef08dcfe25bbb1a087167edd" exitCode=143 Apr 16 20:59:59.826272 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.824531 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"] Apr 16 20:59:59.827671 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.827612 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a03d4-7967858c6b-gzxdj"] Apr 16 20:59:59.955129 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.954278 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b5lv7_must-gather-9xxxf_b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad/copy/0.log" Apr 16 20:59:59.955129 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.954718 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 20:59:59.960128 ip-10-0-134-158 kubenswrapper[2574]: I0416 20:59:59.959495 2574 status_manager.go:895] "Failed to get status for pod" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" err="pods \"must-gather-9xxxf\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b5lv7\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 16 21:00:00.007155 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.007116 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-must-gather-output\") pod \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " Apr 16 21:00:00.007334 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.007222 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6vkb\" (UniqueName: \"kubernetes.io/projected/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-kube-api-access-x6vkb\") pod \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\" (UID: \"b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad\") " Apr 16 21:00:00.011127 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.009057 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" (UID: "b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:00:00.014999 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.013584 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-kube-api-access-x6vkb" (OuterVolumeSpecName: "kube-api-access-x6vkb") pod "b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" (UID: "b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad"). InnerVolumeSpecName "kube-api-access-x6vkb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:00:00.108393 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.108305 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6vkb\" (UniqueName: \"kubernetes.io/projected/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-kube-api-access-x6vkb\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 21:00:00.108393 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.108345 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad-must-gather-output\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 16 21:00:00.819494 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.819417 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b5lv7_must-gather-9xxxf_b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad/copy/0.log" Apr 16 21:00:00.824145 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.820450 2574 scope.go:117] "RemoveContainer" containerID="65f02825b5275eaa4397c78b7dc3fee547967bd0ef08dcfe25bbb1a087167edd" Apr 16 21:00:00.824145 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.820564 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" Apr 16 21:00:00.834228 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.833435 2574 status_manager.go:895] "Failed to get status for pod" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" err="pods \"must-gather-9xxxf\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b5lv7\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 16 21:00:00.835771 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.835731 2574 status_manager.go:895] "Failed to get status for pod" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" pod="openshift-must-gather-b5lv7/must-gather-9xxxf" err="pods \"must-gather-9xxxf\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-b5lv7\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 16 21:00:00.841052 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:00.840298 2574 scope.go:117] "RemoveContainer" containerID="07c39f81442c92b32408699f28df8eeeb96cf18a0b4ed350a999f331847565e4" Apr 16 21:00:01.399702 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.399666 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" path="/var/lib/kubelet/pods/b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad/volumes" Apr 16 21:00:01.400203 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.400182 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" path="/var/lib/kubelet/pods/d98ff6a5-3a67-4285-94d0-d7659e0c9414/volumes" Apr 16 21:00:01.595734 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.595693 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-766f745c7-bkbk7_b2870419-823d-4e89-99b0-90e1ff3cba57/metrics-server/0.log" Apr 16 21:00:01.621954 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.621921 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8ttxz_814dcf6b-d546-4e89-ba47-3f454a8db7ad/monitoring-plugin/0.log" Apr 16 21:00:01.804930 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.804900 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kmhvt_5b579133-c04b-4dcc-beca-86c21e3982a1/node-exporter/0.log" Apr 16 21:00:01.826176 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.826146 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kmhvt_5b579133-c04b-4dcc-beca-86c21e3982a1/kube-rbac-proxy/0.log" Apr 16 21:00:01.847595 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.847555 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kmhvt_5b579133-c04b-4dcc-beca-86c21e3982a1/init-textfile/0.log" Apr 16 21:00:01.973901 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.973870 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/prometheus/0.log" Apr 16 21:00:01.992742 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:01.992716 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/config-reloader/0.log" Apr 16 21:00:02.018694 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.018667 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/thanos-sidecar/0.log" Apr 16 21:00:02.044422 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.044392 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/kube-rbac-proxy-web/0.log" Apr 16 21:00:02.065242 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.065167 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/kube-rbac-proxy/0.log" Apr 16 21:00:02.092041 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.092008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/kube-rbac-proxy-thanos/0.log" Apr 16 21:00:02.124659 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.124625 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cfd231ea-71e7-4ccd-a622-53f9c9762097/init-config-reloader/0.log" Apr 16 21:00:02.168690 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.168660 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-c5znm_a6a0faef-f097-4252-897d-03c4dc8946b1/prometheus-operator/0.log" Apr 16 21:00:02.196837 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.196812 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-c5znm_a6a0faef-f097-4252-897d-03c4dc8946b1/kube-rbac-proxy/0.log" Apr 16 21:00:02.366291 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.366253 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ffb987485-vcq2w_723b9f65-5d9e-43f5-86f3-1e0655fdc449/thanos-query/0.log" Apr 16 21:00:02.389441 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.389411 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ffb987485-vcq2w_723b9f65-5d9e-43f5-86f3-1e0655fdc449/kube-rbac-proxy-web/0.log" Apr 16 21:00:02.415219 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.415188 
2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ffb987485-vcq2w_723b9f65-5d9e-43f5-86f3-1e0655fdc449/kube-rbac-proxy/0.log"
Apr 16 21:00:02.440386 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.440273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ffb987485-vcq2w_723b9f65-5d9e-43f5-86f3-1e0655fdc449/prom-label-proxy/0.log"
Apr 16 21:00:02.460588 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.460553 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ffb987485-vcq2w_723b9f65-5d9e-43f5-86f3-1e0655fdc449/kube-rbac-proxy-rules/0.log"
Apr 16 21:00:02.487076 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:02.487046 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ffb987485-vcq2w_723b9f65-5d9e-43f5-86f3-1e0655fdc449/kube-rbac-proxy-metrics/0.log"
Apr 16 21:00:05.155136 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.155092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h7pxk_1897c5f1-ae77-47b3-96cf-15366561bfa3/dns/0.log"
Apr 16 21:00:05.174298 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.174269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h7pxk_1897c5f1-ae77-47b3-96cf-15366561bfa3/kube-rbac-proxy/0.log"
Apr 16 21:00:05.274836 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.274807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s8ghz_f5a78283-8ec7-49a3-9423-1ae8f58f10ec/dns-node-resolver/0.log"
Apr 16 21:00:05.707365 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.707336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-djk7r_5f1ca9fc-d31a-4c74-aa4f-c265b81f18f6/node-ca/0.log"
Apr 16 21:00:05.714848 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.714821 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"]
Apr 16 21:00:05.715119 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715097 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4"
Apr 16 21:00:05.715166 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715122 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4"
Apr 16 21:00:05.715166 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715136 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="copy"
Apr 16 21:00:05.715166 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715142 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="copy"
Apr 16 21:00:05.715166 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715158 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="gather"
Apr 16 21:00:05.715166 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715164 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="gather"
Apr 16 21:00:05.715310 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715211 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d98ff6a5-3a67-4285-94d0-d7659e0c9414" containerName="switch-graph-a03d4"
Apr 16 21:00:05.715310 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715225 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="copy"
Apr 16 21:00:05.715310 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.715234 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b18c9e94-06ed-4317-84c0-5fe3fa8ca5ad" containerName="gather"
Apr 16 21:00:05.720218 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.720195 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.727814 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.727791 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"]
Apr 16 21:00:05.754450 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.754420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-lib-modules\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.754620 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.754456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzml5\" (UniqueName: \"kubernetes.io/projected/e6feaa8b-de68-4566-b665-509288ec63a8-kube-api-access-mzml5\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.754620 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.754551 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-sys\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.754620 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.754595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-proc\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.754738 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.754635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-podres\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855033 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-podres\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855207 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-lib-modules\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855207 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzml5\" (UniqueName: \"kubernetes.io/projected/e6feaa8b-de68-4566-b665-509288ec63a8-kube-api-access-mzml5\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855207 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-sys\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855207 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-proc\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855207 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-lib-modules\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855207 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855176 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-podres\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855431 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-proc\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.855431 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.855236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6feaa8b-de68-4566-b665-509288ec63a8-sys\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:05.862698 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:05.862672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzml5\" (UniqueName: \"kubernetes.io/projected/e6feaa8b-de68-4566-b665-509288ec63a8-kube-api-access-mzml5\") pod \"perf-node-gather-daemonset-2njmx\" (UID: \"e6feaa8b-de68-4566-b665-509288ec63a8\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:06.032536 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:06.032497 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:06.177973 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:06.177934 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"]
Apr 16 21:00:06.820469 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:06.820442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vq5vz_2380cbc7-d39a-4681-ac0d-6e245781eba4/serve-healthcheck-canary/0.log"
Apr 16 21:00:06.844913 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:06.844845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx" event={"ID":"e6feaa8b-de68-4566-b665-509288ec63a8","Type":"ContainerStarted","Data":"c69d832e370bfe1cd9c38432b5be21d7732f2a07f6b935856d63344d6b5f49fa"}
Apr 16 21:00:06.844913 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:06.844885 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx" event={"ID":"e6feaa8b-de68-4566-b665-509288ec63a8","Type":"ContainerStarted","Data":"9ee5f24ff4ce22eb6fb5aeb6cd2bb2bc48eb7e5588aed458c7b6e1e44ace4bf4"}
Apr 16 21:00:06.845330 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:06.845282 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:07.178835 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:07.178740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bd6r6_913f758d-f60f-4832-a358-f15c2a5f2709/kube-rbac-proxy/0.log"
Apr 16 21:00:07.198714 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:07.198679 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bd6r6_913f758d-f60f-4832-a358-f15c2a5f2709/exporter/0.log"
Apr 16 21:00:07.220150 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:07.220121 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bd6r6_913f758d-f60f-4832-a358-f15c2a5f2709/extractor/0.log"
Apr 16 21:00:09.341337 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:09.341304 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-dkl42_3ba76c1a-d074-4aef-9798-9cd3ce4a7826/server/0.log"
Apr 16 21:00:09.811748 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:09.811713 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-gr82t_5cdaa442-7025-41c6-989e-5d1f82822423/manager/0.log"
Apr 16 21:00:09.832918 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:09.832888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jvcft_d4c6cc98-1c17-46e0-ae8e-b749d4c1775c/s3-init/0.log"
Apr 16 21:00:09.861496 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:09.861473 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-7kgxr_a2e072ed-f2ae-4d03-b630-d108a0c477c8/seaweedfs/0.log"
Apr 16 21:00:12.858383 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:12.858353 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx"
Apr 16 21:00:12.874251 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:12.874191 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-2njmx" podStartSLOduration=7.874173126 podStartE2EDuration="7.874173126s" podCreationTimestamp="2026-04-16 21:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:06.862770483 +0000 UTC m=+2904.037839651" watchObservedRunningTime="2026-04-16 21:00:12.874173126 +0000 UTC m=+2910.049242294"
Apr 16 21:00:14.524654 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:14.524620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-554gl_95a10881-801c-4945-a254-7cb7bf980128/kube-multus/0.log"
Apr 16 21:00:15.154374 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.154350 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:00:15.198256 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.198189 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/egress-router-binary-copy/0.log"
Apr 16 21:00:15.251339 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.251303 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/cni-plugins/0.log"
Apr 16 21:00:15.321773 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.321747 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/bond-cni-plugin/0.log"
Apr 16 21:00:15.369363 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.369331 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/routeoverride-cni/0.log"
Apr 16 21:00:15.404834 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.404801 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/whereabouts-cni-bincopy/0.log"
Apr 16 21:00:15.446040 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.446016 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zj8xr_174b60ef-32a3-4bd0-a527-a01fa61b76bb/whereabouts-cni/0.log"
Apr 16 21:00:15.603554 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.603522 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gmj69_604b143f-56b9-4ff2-a025-f1f904de0066/network-metrics-daemon/0.log"
Apr 16 21:00:15.641495 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:15.641466 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gmj69_604b143f-56b9-4ff2-a025-f1f904de0066/kube-rbac-proxy/0.log"
Apr 16 21:00:16.947144 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:16.947092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/ovn-controller/0.log"
Apr 16 21:00:16.992731 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:16.992673 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/ovn-acl-logging/0.log"
Apr 16 21:00:17.013842 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:17.013804 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/kube-rbac-proxy-node/0.log"
Apr 16 21:00:17.037875 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:17.037831 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:00:17.060327 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:17.060249 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/northd/0.log"
Apr 16 21:00:17.081869 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:17.081840 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/nbdb/0.log"
Apr 16 21:00:17.106515 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:17.106488 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/sbdb/0.log"
Apr 16 21:00:17.309018 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:17.308984 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh5vc_45d8d50c-84a1-4de5-af66-9216392f6268/ovnkube-controller/0.log"
Apr 16 21:00:18.276128 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:18.276085 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nhjrd_3251e838-9ac0-43bc-88bb-3f2002d4ad60/network-check-target-container/0.log"
Apr 16 21:00:19.197249 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:19.197219 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qx265_469ea03f-422f-4251-bd51-04361b2e17fc/iptables-alerter/0.log"
Apr 16 21:00:19.882496 ip-10-0-134-158 kubenswrapper[2574]: I0416 21:00:19.882460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7xl9r_bd2529ae-05ec-4e4d-be54-a85e19f1b7b7/tuned/0.log"