Apr 16 16:21:37.726855 ip-10-0-128-64 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:21:37.726868 ip-10-0-128-64 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:21:37.726877 ip-10-0-128-64 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:21:37.727394 ip-10-0-128-64 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:21:47.865930 ip-10-0-128-64 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:21:47.865947 ip-10-0-128-64 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 535efc4c44604f6e99060519f4384ccf --
Apr 16 16:24:04.157220 ip-10-0-128-64 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:24:04.604520 ip-10-0-128-64 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:24:04.604520 ip-10-0-128-64 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:24:04.604520 ip-10-0-128-64 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:24:04.604520 ip-10-0-128-64 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:24:04.604520 ip-10-0-128-64 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:24:04.608024 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.607935 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:24:04.611337 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611316 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:24:04.611337 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611333 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:24:04.611337 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611338 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:24:04.611337 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611341 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:24:04.611337 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611344 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611347 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611351 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611354 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611356 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611359 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611362 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611364 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611367 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611370 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611373 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611376 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611378 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611381 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611384 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611386 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611389 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611391 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611394 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611397 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:24:04.611524 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611399 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611402 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611410 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611413 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611415 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611418 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611420 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611423 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611425 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611428 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611431 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611433 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611436 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611439 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611442 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611445 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611448 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611450 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611452 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611455 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:24:04.612060 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611458 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611461 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611465 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611469 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611472 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611475 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611477 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611480 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611486 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611491 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611496 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611499 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611502 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611505 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611507 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611510 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611512 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611515 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611518 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:24:04.612555 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611520 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611523 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611525 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611528 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611531 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611534 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611537 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611540 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611543 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611545 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611548 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611551 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611553 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611556 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611558 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611563 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611566 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611568 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611571 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611573 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:24:04.613022 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611576 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611580 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.611582 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612828 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612838 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612841 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612845 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612847 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612850 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612853 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612856 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612859 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612862 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612865 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612869 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612871 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612874 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612876 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612880 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612884 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:24:04.613532 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612887 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612890 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612892 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612895 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612897 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612900 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612903 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612907 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612909 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612912 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612915 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612930 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612934 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612937 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612940 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612943 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612947 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612951 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612953 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612956 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:24:04.614028 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612959 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612962 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612965 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612968 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612971 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612974 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612977 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612980 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612983 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612986 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612989 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612993 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612996 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.612999 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613001 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613004 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613006 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613009 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613011 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:24:04.614525 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613014 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613018 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613021 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613023 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613026 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613029 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613031 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613034 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613037 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613040 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613042 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613045 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613047 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613050 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613053 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613055 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613058 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613061 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613064 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613067 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:24:04.615014 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613069 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613072 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613074 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613077 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613080 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613083 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613085 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613088 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613090 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.613093 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613651 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613660 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613667 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613671 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613677 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613680 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613685 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613690 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613693 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613696 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613700 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613703 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:24:04.615506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613706 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613709 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613712 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613715 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613718 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613721 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613724 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613729 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613732 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613735 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613738 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613742 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613746 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613749 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613753 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613756 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613759 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613763 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613766 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613769 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613772 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613776 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613779 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613782 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613785 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:24:04.616069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613789 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613792 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613796 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613799 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613803 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613806 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613809 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613812 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613815 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613818 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613821 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613824 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613827 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613830 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613833 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613837 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613839 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613842 2578 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613846 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613850 2578 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613853 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613856 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613860 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613863 2578 flags.go:64] FLAG: --help="false" Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613866 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-128-64.ec2.internal" Apr 16 16:24:04.616677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613870 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613873 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613876 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613879 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613882 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613885 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613888 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613891 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:24:04.617328 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:24:04.613894 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613897 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613900 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613903 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613906 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613909 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613912 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613915 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613930 2578 flags.go:64] FLAG: --lock-file="" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613934 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613937 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613940 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613946 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613952 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613955 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 
16:24:04.617328 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613958 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613961 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613965 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613968 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613971 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613976 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613979 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613983 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613987 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613990 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613992 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613995 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.613998 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614002 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614004 2578 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614013 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614016 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614019 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614022 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614025 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614031 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614034 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614037 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614041 2578 flags.go:64] FLAG: --port="10250" Apr 16 16:24:04.617894 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614044 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614047 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0570a9c5a624f51ce" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614050 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614052 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:04.614060 2578 flags.go:64] FLAG: --register-node="true" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614063 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614067 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614071 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614074 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614077 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614080 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614083 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614087 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614090 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614093 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614095 2578 flags.go:64] FLAG: --runonce="false" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614098 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614101 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614104 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:04.614107 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614110 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614113 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614130 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614134 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614138 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614141 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:24:04.618478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614144 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614148 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614151 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614154 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614157 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614163 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614166 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614169 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614174 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614177 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614181 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614184 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614189 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614192 2578 flags.go:64] FLAG: --v="2"
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614196 2578 flags.go:64] FLAG: --version="false"
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614201 2578 flags.go:64] FLAG: --vmodule=""
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614205 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.614208 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614302 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614306 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614310 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614312 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614315 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:24:04.619161 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614318 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614321 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614325 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614328 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614331 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614333 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614336 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614339 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614341 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614344 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614346 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614350 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614353 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614356 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614358 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614361 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614363 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614366 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614368 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614372 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:24:04.619722 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614375 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614379 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614382 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614384 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614387 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614390 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614392 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614395 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614398 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614400 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614403 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614405 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614408 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614410 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614413 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614415 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614417 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614420 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614422 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614425 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:24:04.620243 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614427 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614430 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614432 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614435 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614438 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614440 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614443 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614445 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614449 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614452 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614456 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614460 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614463 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614467 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614469 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614472 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614474 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614477 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614480 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614482 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:24:04.620760 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614485 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614488 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614490 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614493 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614496 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614499 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614502 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614504 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614506 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614509 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614512 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614514 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614517 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614519 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614522 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614524 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614527 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614533 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614536 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:24:04.621268 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614539 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:24:04.621754 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.614541 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:24:04.621754 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.615442 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:24:04.622739 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.622720 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:24:04.622777 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.622741 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:24:04.622805 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622791 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:24:04.622805 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622797 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:24:04.622805 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622801 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:24:04.622805 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622804 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622807 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622810 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622813 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622816 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622820 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622823 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622826 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622828 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622831 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622834 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622836 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622839 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622842 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622844 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622847 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622850 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622852 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622856 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622861 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:24:04.622906 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622864 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622866 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622869 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622872 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622874 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622877 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622880 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622882 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622886 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622888 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622891 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622894 2578 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622896 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622898 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622902 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622905 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622908 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622910 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622913 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:24:04.623436 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622916 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622932 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622937 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622940 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622942 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622945 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622947 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622950 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622953 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622956 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622958 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622961 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622963 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622966 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622969 2578 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622971 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622974 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622977 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622979 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622982 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:24:04.623902 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622984 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622987 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622990 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622993 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622996 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.622998 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623001 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 
16:24:04.623004 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623007 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623011 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623013 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623016 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623019 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623021 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623024 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623026 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623029 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623032 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623034 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:24:04.624399 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623037 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:24:04.624399 ip-10-0-128-64 
kubenswrapper[2578]: W0416 16:24:04.623040 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623043 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623045 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623048 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.623053 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623158 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623163 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623166 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623169 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623172 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:24:04.624981 
ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623176 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623179 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623182 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623184 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623187 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:24:04.624981 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623190 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623193 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623195 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623198 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623201 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623205 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623208 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623210 2578 feature_gate.go:328] unrecognized feature 
gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623213 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623216 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623218 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623221 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623223 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623226 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623229 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623231 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623234 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623236 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623239 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623241 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:24:04.625349 ip-10-0-128-64 kubenswrapper[2578]: W0416 
16:24:04.623244 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623246 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623249 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623251 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623254 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623257 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623259 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623262 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623264 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623267 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623269 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623272 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623275 2578 feature_gate.go:349] Setting deprecated 
feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623280 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623282 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623286 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623288 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623291 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623294 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:24:04.625852 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623296 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623299 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623301 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623304 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623306 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623309 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:24:04.626334 
ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623311 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623314 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623317 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623319 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623323 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623327 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623330 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623332 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623335 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623338 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623340 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623343 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623346 2578 feature_gate.go:328] 
unrecognized feature gate: OVNObservability Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623348 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:24:04.626334 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623351 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623354 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623357 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623359 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623362 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623364 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623367 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623370 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623373 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623375 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623378 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623380 2578 feature_gate.go:328] unrecognized 
feature gate: AzureDedicatedHosts Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623383 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623385 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623388 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623390 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:24:04.626816 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:04.623393 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:24:04.627228 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.623398 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:24:04.627228 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.624206 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:24:04.627686 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.627670 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:24:04.628538 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.628526 2578 server.go:1019] "Starting client certificate rotation" Apr 16 16:24:04.628633 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.628619 2578 
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:24:04.629411 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.629398 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:24:04.655807 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.655786 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:24:04.657646 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.657624 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:24:04.675632 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.675603 2578 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:24:04.681049 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.681030 2578 log.go:25] "Validated CRI v1 image API" Apr 16 16:24:04.682280 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.682261 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:24:04.686534 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.686515 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 be057d40-9e2f-4b0e-992e-027fa079f8ea:/dev/nvme0n1p4 be67654d-fbfd-44f6-a154-84be8766bc17:/dev/nvme0n1p3] Apr 16 16:24:04.686588 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.686534 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:24:04.689158 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:24:04.689140 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:24:04.691672 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.691556 2578 manager.go:217] Machine: {Timestamp:2026-04-16 16:24:04.690488097 +0000 UTC m=+0.413032479 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098930 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c53ff642d1a5ed9134b44c9b6beae SystemUUID:ec2c53ff-642d-1a5e-d913-4b44c9b6beae BootID:535efc4c-4460-4f6e-9906-0519f4384ccf Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:38:c6:bb:47 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:38:c6:bb:47 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:25:4c:09:b5:e0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 
Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:24:04.691672 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.691668 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 16:24:04.691781 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.691754 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:24:04.693337 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.693315 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:24:04.693499 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.693339 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-64.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:24:04.693548 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.693511 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:24:04.693548 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.693520 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:24:04.693548 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.693534 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:24:04.694145 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.694135 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:24:04.694961 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.694952 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:24:04.695061 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.695053 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:24:04.697165 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.697154 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:24:04.697206 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.697171 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:24:04.697206 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.697189 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:24:04.697263 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.697212 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:24:04.697263 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.697225 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:24:04.698269 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.698255 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:24:04.698316 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.698281 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:24:04.701385 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.701367 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:24:04.703046 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.703032 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:24:04.704263 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704250 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704268 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704274 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704280 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704285 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704291 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704297 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704303 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704313 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704319 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:24:04.704329 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704331 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:24:04.704732 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.704341 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:24:04.705204 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.705192 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:24:04.705204 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.705203 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:24:04.708758 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.708739 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:24:04.708837 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.708802 2578 server.go:1295] "Started kubelet"
Apr 16 16:24:04.708947 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.708900 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:24:04.709107 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.709056 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:24:04.709152 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.709136 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:24:04.709842 ip-10-0-128-64 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:24:04.710226 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.710203 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:24:04.710319 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.710267 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:24:04.714653 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.714629 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-64.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:24:04.714766 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.714685 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-64.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:24:04.714859 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.714838 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:24:04.718639 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.717517 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-64.ec2.internal.18a6e2f4f2453f39 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-64.ec2.internal,UID:ip-10-0-128-64.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-64.ec2.internal,},FirstTimestamp:2026-04-16 16:24:04.708761401 +0000 UTC m=+0.431305800,LastTimestamp:2026-04-16 16:24:04.708761401 +0000 UTC m=+0.431305800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-64.ec2.internal,}"
Apr 16 16:24:04.720108 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.720085 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:24:04.720680 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.720652 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:24:04.721404 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721384 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:24:04.721404 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721387 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:24:04.721553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721413 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:24:04.721553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721392 2578 factory.go:55] Registering systemd factory
Apr 16 16:24:04.721553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721504 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:24:04.721553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721545 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:24:04.721553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721554 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:24:04.721802 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.721785 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:04.721802 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721790 2578 factory.go:153] Registering CRI-O factory
Apr 16 16:24:04.721900 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721813 2578 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:24:04.721900 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721871 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:24:04.721900 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721896 2578 factory.go:103] Registering Raw factory
Apr 16 16:24:04.722073 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.721910 2578 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:24:04.722303 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.722284 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:24:04.722537 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.722524 2578 manager.go:319] Starting recovery of all containers
Apr 16 16:24:04.723458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.723429 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-64.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 16:24:04.723557 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.723532 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 16:24:04.731026 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.731000 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8hgf2"
Apr 16 16:24:04.733367 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.733352 2578 manager.go:324] Recovery completed
Apr 16 16:24:04.737739 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.737725 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:24:04.740007 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.739995 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:24:04.740066 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.740021 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:24:04.740066 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.740031 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:24:04.740536 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.740524 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:24:04.740536 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.740535 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:24:04.740615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.740551 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:24:04.741042 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.741026 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8hgf2"
Apr 16 16:24:04.742614 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.742549 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-64.ec2.internal.18a6e2f4f4220acb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-64.ec2.internal,UID:ip-10-0-128-64.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-64.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-64.ec2.internal,},FirstTimestamp:2026-04-16 16:24:04.740008651 +0000 UTC m=+0.462553032,LastTimestamp:2026-04-16 16:24:04.740008651 +0000 UTC m=+0.462553032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-64.ec2.internal,}"
Apr 16 16:24:04.742824 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.742812 2578 policy_none.go:49] "None policy: Start"
Apr 16 16:24:04.742866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.742829 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:24:04.742866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.742839 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:24:04.784736 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.784548 2578 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.784762 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.784776 2578 server.go:85] "Starting device plugin registration server"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.785039 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.785050 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.785136 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.785210 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.785218 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.785754 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:24:04.792762 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.785788 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:04.848131 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.848094 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:24:04.849332 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.849305 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:24:04.849332 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.849334 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:24:04.849490 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.849357 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:24:04.849490 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.849365 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:24:04.849490 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.849397 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:24:04.853832 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.853810 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:24:04.886137 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.886076 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:24:04.887131 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.887114 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:24:04.887226 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.887144 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:24:04.887226 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.887156 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:24:04.887226 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.887186 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-64.ec2.internal"
Apr 16 16:24:04.894764 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.894749 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-64.ec2.internal"
Apr 16 16:24:04.894840 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.894770 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-64.ec2.internal\": node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:04.949519 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.949486 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"]
Apr 16 16:24:04.949679 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.949556 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:24:04.951214 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.951198 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:24:04.951306 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.951226 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:24:04.951306 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.951240 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:24:04.952645 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.952632 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:24:04.952812 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.952794 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:04.952871 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.952834 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:24:04.953489 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.953466 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:24:04.953557 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.953487 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:24:04.953557 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.953502 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:24:04.953557 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.953514 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:24:04.953647 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.953514 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:24:04.953647 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.953596 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:24:04.955272 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.955257 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:04.955314 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.955290 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:24:04.956128 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.956110 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:04.956496 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.956483 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:24:04.956568 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.956511 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:24:04.956568 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:04.956525 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:24:04.978481 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.978459 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-64.ec2.internal\" not found" node="ip-10-0-128-64.ec2.internal"
Apr 16 16:24:04.982955 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:04.982937 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-64.ec2.internal\" not found" node="ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.056479 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.056451 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.123277 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.123251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b8a86ca79ec9db287bd2c4a7f1165a3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal\" (UID: \"b8a86ca79ec9db287bd2c4a7f1165a3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.123394 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.123283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8a86ca79ec9db287bd2c4a7f1165a3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal\" (UID: \"b8a86ca79ec9db287bd2c4a7f1165a3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.123394 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.123307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b52ab70ac24bb6b9ef3948f422fe6d61-config\") pod \"kube-apiserver-proxy-ip-10-0-128-64.ec2.internal\" (UID: \"b52ab70ac24bb6b9ef3948f422fe6d61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.156833 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.156760 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.224497 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.224413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b52ab70ac24bb6b9ef3948f422fe6d61-config\") pod \"kube-apiserver-proxy-ip-10-0-128-64.ec2.internal\" (UID: \"b52ab70ac24bb6b9ef3948f422fe6d61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.224572 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.224522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b8a86ca79ec9db287bd2c4a7f1165a3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal\" (UID: \"b8a86ca79ec9db287bd2c4a7f1165a3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.224572 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.224539 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8a86ca79ec9db287bd2c4a7f1165a3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal\" (UID: \"b8a86ca79ec9db287bd2c4a7f1165a3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.224572 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.224465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b52ab70ac24bb6b9ef3948f422fe6d61-config\") pod \"kube-apiserver-proxy-ip-10-0-128-64.ec2.internal\" (UID: \"b52ab70ac24bb6b9ef3948f422fe6d61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.224708 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.224578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8a86ca79ec9db287bd2c4a7f1165a3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal\" (UID: \"b8a86ca79ec9db287bd2c4a7f1165a3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.224708 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.224611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b8a86ca79ec9db287bd2c4a7f1165a3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal\" (UID: \"b8a86ca79ec9db287bd2c4a7f1165a3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.257563 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.257528 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.281719 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.281698 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.286272 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.286250 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.358018 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.357978 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.458592 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.458519 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.558988 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.558968 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.628517 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.628490 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:24:05.628953 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.628623 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:24:05.659778 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:05.659751 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-64.ec2.internal\" not found"
Apr 16 16:24:05.686993 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.686973 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:24:05.720849 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.720829 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:24:05.732552 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.732524 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:24:05.742466 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.742412 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:19:04 +0000 UTC" deadline="2027-10-10 02:19:08.387320676 +0000 UTC"
Apr 16 16:24:05.742466 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.742457 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12993h55m2.64486812s"
Apr 16 16:24:05.744948 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.744914 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:24:05.755364 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.755335 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n6g2x"
Apr 16 16:24:05.765127 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.765102 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n6g2x"
Apr 16 16:24:05.810960 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:05.810914 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a86ca79ec9db287bd2c4a7f1165a3f.slice/crio-1562c60d92d8009326ad906adfc69328e32b0404fe92a4252b86e46cc86191f6 WatchSource:0}: Error finding container 1562c60d92d8009326ad906adfc69328e32b0404fe92a4252b86e46cc86191f6: Status 404 returned error can't find the container with id 1562c60d92d8009326ad906adfc69328e32b0404fe92a4252b86e46cc86191f6
Apr 16 16:24:05.811426 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:05.811405 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb52ab70ac24bb6b9ef3948f422fe6d61.slice/crio-5c5bf8eb5e8274a94402e90a4924b27c8f3c268af45bd50c4235c7749516c73d WatchSource:0}: Error finding container 5c5bf8eb5e8274a94402e90a4924b27c8f3c268af45bd50c4235c7749516c73d: Status 404 returned error can't find the container with id 5c5bf8eb5e8274a94402e90a4924b27c8f3c268af45bd50c4235c7749516c73d
Apr 16 16:24:05.815229 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.815216 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:24:05.821832 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.821815 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.832046 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.832015 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:24:05.834181 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.834167 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal"
Apr 16 16:24:05.844213 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.844193 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:24:05.852155 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.852107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal" event={"ID":"b8a86ca79ec9db287bd2c4a7f1165a3f","Type":"ContainerStarted","Data":"1562c60d92d8009326ad906adfc69328e32b0404fe92a4252b86e46cc86191f6"}
Apr 16 16:24:05.853153 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:05.853133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal" event={"ID":"b52ab70ac24bb6b9ef3948f422fe6d61","Type":"ContainerStarted","Data":"5c5bf8eb5e8274a94402e90a4924b27c8f3c268af45bd50c4235c7749516c73d"}
Apr 16 16:24:06.049216 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.049189 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:24:06.697957 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.697860 2578 apiserver.go:52] "Watching apiserver"
Apr 16 16:24:06.709449 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.709421 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:24:06.710528 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.710503 2578 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["kube-system/konnectivity-agent-x7ld8","kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct","openshift-cluster-node-tuning-operator/tuned-57m87","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal","openshift-multus/multus-lsqww","openshift-network-diagnostics/network-check-target-n26xz","openshift-network-operator/iptables-alerter-vjlfb","openshift-dns/node-resolver-4stxv","openshift-image-registry/node-ca-cv6tw","openshift-multus/multus-additional-cni-plugins-wflp7","openshift-multus/network-metrics-daemon-stfn4","openshift-ovn-kubernetes/ovnkube-node-dj5w9"] Apr 16 16:24:06.713816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.713786 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:06.713949 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.713855 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:06.714824 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.714804 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.716120 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.716096 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:06.716270 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.716247 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.716994 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.716972 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jtp8d\"" Apr 16 16:24:06.717164 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.717147 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.717230 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.717190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.717453 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.717434 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.718152 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.718130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:24:06.718247 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.718231 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:24:06.718553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.718536 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.718858 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.718844 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:24:06.719078 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719053 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rnf7g\"" Apr 16 16:24:06.719258 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:24:06.719543 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719528 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qcdj8\"" Apr 16 16:24:06.719755 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719635 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.719755 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719642 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:24:06.719755 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719641 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-d5kdb\"" Apr 16 16:24:06.720074 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719811 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.720074 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.719854 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:24:06.720342 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.720325 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.720427 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.720356 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.721856 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.721768 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lsqww" Apr 16 16:24:06.723019 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.723003 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.723144 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.723127 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.723247 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.723232 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:24:06.724472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.724173 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2q4vz\"" Apr 16 16:24:06.724472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.724195 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:24:06.724472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.724200 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.724472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.724244 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.724472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.724184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-svp7v\"" Apr 16 16:24:06.725113 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.725086 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.725208 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.725091 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84d79\"" Apr 16 16:24:06.725208 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.725201 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.725683 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.725665 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:06.725683 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.725676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.725799 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.725731 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:06.726547 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.726529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-f86px\"" Apr 16 16:24:06.726547 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.726541 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.726839 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.726824 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:24:06.726898 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.726830 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.730803 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.730784 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:24:06.730890 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.730876 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-62zc7\"" Apr 16 16:24:06.731119 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731011 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:24:06.731119 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731027 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:24:06.731119 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731118 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:24:06.731313 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:24:06.731733 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-etc-selinux\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.731828 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqscb\" (UniqueName: \"kubernetes.io/projected/e31ca37d-89f4-4fb4-9f61-d0df5565b487-kube-api-access-xqscb\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.731828 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e95b712d-7106-4990-96bd-48d8764b3a55-tmp\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.731828 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731791 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-os-release\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww" Apr 16 
16:24:06.731828 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-systemd\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.731828 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfmt\" (UniqueName: \"kubernetes.io/projected/e95b712d-7106-4990-96bd-48d8764b3a55-kube-api-access-mrfmt\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.731828 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731828 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-daemon-config\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:06.731907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-system-cni-dir\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-lib-modules\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.731994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d333c14-36c2-41c1-af1c-1c7a8daaee58-agent-certs\") pod \"konnectivity-agent-x7ld8\" (UID: \"9d333c14-36c2-41c1-af1c-1c7a8daaee58\") " pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-system-cni-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e6722ce7-f829-490e-9ee8-97b3a979ca09-hosts-file\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " 
pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28sx\" (UniqueName: \"kubernetes.io/projected/e746431a-e308-4ccf-87fa-0969f2b40152-kube-api-access-s28sx\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.732136 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-kubernetes\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.732516 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.732516 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e746431a-e308-4ccf-87fa-0969f2b40152-host-slash\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.732719 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.732672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysctl-d\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d333c14-36c2-41c1-af1c-1c7a8daaee58-konnectivity-ca\") pod \"konnectivity-agent-x7ld8\" (UID: \"9d333c14-36c2-41c1-af1c-1c7a8daaee58\") " pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-registration-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shd8n\" (UniqueName: \"kubernetes.io/projected/e6722ce7-f829-490e-9ee8-97b3a979ca09-kube-api-access-shd8n\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-modprobe-d\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:06.734763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e95b712d-7106-4990-96bd-48d8764b3a55-etc-tuned\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-socket-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734828 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-host\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cni-binary-copy\") pod \"multus-additional-cni-plugins-wflp7\" (UID: 
\"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysctl-conf\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.734965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6722ce7-f829-490e-9ee8-97b3a979ca09-tmp-dir\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysconfig\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-sys-fs\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.736352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-cni-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-cnibin\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-k8s-cni-cncf-io\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-netns\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-cni-bin\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-kubelet\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-etc-kubernetes\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-os-release\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-run\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-sys\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-var-lib-kubelet\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvw7\" (UniqueName: \"kubernetes.io/projected/7f749d5c-04b9-4ecc-8853-c8deff057ad4-kube-api-access-vcvw7\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-device-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-cni-multus\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-hostroot\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.735941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e746431a-e308-4ccf-87fa-0969f2b40152-iptables-alerter-script\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.736001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-conf-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737077 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.736051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-cni-binary-copy\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.736109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-socket-dir-parent\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.736157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-multus-certs\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.736182 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkmf\" (UniqueName: \"kubernetes.io/projected/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-kube-api-access-tnkmf\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.737885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.736227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cnibin\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.765731 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.765699 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:19:05 +0000 UTC" deadline="2028-01-14 18:14:07.551454938 +0000 UTC"
Apr 16 16:24:06.765731 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.765728 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15313h50m0.78573009s"
Apr 16 16:24:06.822228 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.822196 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:24:06.836682 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysconfig\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.836682 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.836908 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-kubelet\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.836908 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-sys-fs\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.836908 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysconfig\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.836908 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-cni-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-cnibin\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-k8s-cni-cncf-io\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-netns\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-sys-fs\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.836994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-cni-bin\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-netns\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-cni-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-cni-bin\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-k8s-cni-cncf-io\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-kubelet\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-cnibin\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837118 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtslj\" (UniqueName: \"kubernetes.io/projected/8a2d315f-d3a3-4cab-97d9-becca3a12249-kube-api-access-vtslj\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-etc-kubernetes\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-os-release\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-kubelet\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837201 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-systemd-units\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-etc-kubernetes\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-systemd\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-log-socket\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-ovnkube-script-lib\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-os-release\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-run\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-sys\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-var-lib-kubelet\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvw7\" (UniqueName: \"kubernetes.io/projected/7f749d5c-04b9-4ecc-8853-c8deff057ad4-kube-api-access-vcvw7\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-run\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a2d315f-d3a3-4cab-97d9-becca3a12249-host\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-var-lib-kubelet\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.837618 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-etc-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-node-log\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-sys\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-device-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-device-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-cni-multus\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-hostroot\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e746431a-e308-4ccf-87fa-0969f2b40152-iptables-alerter-script\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-var-lib-cni-multus\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-var-lib-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-hostroot\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-conf-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8a2d315f-d3a3-4cab-97d9-becca3a12249-serviceca\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-conf-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-cni-bin\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.838383 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-cni-binary-copy\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-socket-dir-parent\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-multus-certs\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-host-run-multus-certs\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.837993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-socket-dir-parent\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkmf\" (UniqueName: \"kubernetes.io/projected/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-kube-api-access-tnkmf\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cnibin\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838082 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ff3f6de-097e-4812-8f8e-276c41254178-ovn-node-metrics-cert\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-etc-selinux\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqscb\" (UniqueName: \"kubernetes.io/projected/e31ca37d-89f4-4fb4-9f61-d0df5565b487-kube-api-access-xqscb\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e95b712d-7106-4990-96bd-48d8764b3a55-tmp\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-os-release\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-systemd\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfmt\" (UniqueName: \"kubernetes.io/projected/e95b712d-7106-4990-96bd-48d8764b3a55-kube-api-access-mrfmt\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-daemon-config\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-system-cni-dir\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7"
Apr 16 16:24:06.839167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e746431a-e308-4ccf-87fa-0969f2b40152-iptables-alerter-script\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-lib-modules\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d333c14-36c2-41c1-af1c-1c7a8daaee58-agent-certs\") pod \"konnectivity-agent-x7ld8\" (UID: \"9d333c14-36c2-41c1-af1c-1c7a8daaee58\") " pod="kube-system/konnectivity-agent-x7ld8"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-system-cni-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e6722ce7-f829-490e-9ee8-97b3a979ca09-hosts-file\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-cni-binary-copy\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e6722ce7-f829-490e-9ee8-97b3a979ca09-hosts-file\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-system-cni-dir\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-systemd\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838590 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-lib-modules\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87"
Apr 16 16:24:06.839937 ip-10-0-128-64
kubenswrapper[2578]: I0416 16:24:06.838607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-os-release\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-system-cni-dir\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cnibin\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s28sx\" (UniqueName: \"kubernetes.io/projected/e746431a-e308-4ccf-87fa-0969f2b40152-kube-api-access-s28sx\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-etc-selinux\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 
16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838828 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-kubernetes\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.839937 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-multus-daemon-config\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838961 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e746431a-e308-4ccf-87fa-0969f2b40152-host-slash\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-kubernetes\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.838989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74rw\" (UniqueName: \"kubernetes.io/projected/0b0c36b6-3279-4629-991c-70026ff0d0b6-kube-api-access-j74rw\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysctl-d\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d333c14-36c2-41c1-af1c-1c7a8daaee58-konnectivity-ca\") pod \"konnectivity-agent-x7ld8\" (UID: \"9d333c14-36c2-41c1-af1c-1c7a8daaee58\") " pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:06.840625 
ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e746431a-e308-4ccf-87fa-0969f2b40152-host-slash\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-registration-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shd8n\" (UniqueName: \"kubernetes.io/projected/e6722ce7-f829-490e-9ee8-97b3a979ca09-kube-api-access-shd8n\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-registration-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysctl-d\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839408 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-slash\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-run-netns\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-ovnkube-config\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.840625 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-modprobe-d\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e95b712d-7106-4990-96bd-48d8764b3a55-etc-tuned\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d333c14-36c2-41c1-af1c-1c7a8daaee58-konnectivity-ca\") pod \"konnectivity-agent-x7ld8\" (UID: \"9d333c14-36c2-41c1-af1c-1c7a8daaee58\") " pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-modprobe-d\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-socket-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlncf\" (UniqueName: \"kubernetes.io/projected/3ff3f6de-097e-4812-8f8e-276c41254178-kube-api-access-dlncf\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-ovn\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-cni-netd\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:06.839943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-env-overrides\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e31ca37d-89f4-4fb4-9f61-d0df5565b487-socket-dir\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-host\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.839992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cni-binary-copy\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysctl-conf\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 
16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6722ce7-f829-490e-9ee8-97b3a979ca09-tmp-dir\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-run-ovn-kubernetes\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.841188 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-host\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.841833 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e95b712d-7106-4990-96bd-48d8764b3a55-etc-sysctl-conf\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " 
pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.841833 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.841833 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6722ce7-f829-490e-9ee8-97b3a979ca09-tmp-dir\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.841833 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-cni-binary-copy\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.841833 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.840849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7f749d5c-04b9-4ecc-8853-c8deff057ad4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.842393 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.842368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e95b712d-7106-4990-96bd-48d8764b3a55-tmp\") pod \"tuned-57m87\" 
(UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.842495 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.842473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e95b712d-7106-4990-96bd-48d8764b3a55-etc-tuned\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.842563 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.842549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d333c14-36c2-41c1-af1c-1c7a8daaee58-agent-certs\") pod \"konnectivity-agent-x7ld8\" (UID: \"9d333c14-36c2-41c1-af1c-1c7a8daaee58\") " pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:06.849299 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.849094 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:06.849299 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.849121 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:06.849299 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.849135 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:06.849299 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.849259 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:07.349198678 +0000 UTC m=+3.071743067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:06.849591 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.849452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkmf\" (UniqueName: \"kubernetes.io/projected/0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59-kube-api-access-tnkmf\") pod \"multus-lsqww\" (UID: \"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59\") " pod="openshift-multus/multus-lsqww" Apr 16 16:24:06.849989 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.849965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvw7\" (UniqueName: \"kubernetes.io/projected/7f749d5c-04b9-4ecc-8853-c8deff057ad4-kube-api-access-vcvw7\") pod \"multus-additional-cni-plugins-wflp7\" (UID: \"7f749d5c-04b9-4ecc-8853-c8deff057ad4\") " pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:06.851573 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.851547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfmt\" (UniqueName: \"kubernetes.io/projected/e95b712d-7106-4990-96bd-48d8764b3a55-kube-api-access-mrfmt\") pod \"tuned-57m87\" (UID: \"e95b712d-7106-4990-96bd-48d8764b3a55\") " pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:06.852022 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.851998 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28sx\" (UniqueName: \"kubernetes.io/projected/e746431a-e308-4ccf-87fa-0969f2b40152-kube-api-access-s28sx\") pod \"iptables-alerter-vjlfb\" (UID: \"e746431a-e308-4ccf-87fa-0969f2b40152\") " pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:06.852380 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.852353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shd8n\" (UniqueName: \"kubernetes.io/projected/e6722ce7-f829-490e-9ee8-97b3a979ca09-kube-api-access-shd8n\") pod \"node-resolver-4stxv\" (UID: \"e6722ce7-f829-490e-9ee8-97b3a979ca09\") " pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:06.852454 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.852429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqscb\" (UniqueName: \"kubernetes.io/projected/e31ca37d-89f4-4fb4-9f61-d0df5565b487-kube-api-access-xqscb\") pod \"aws-ebs-csi-driver-node-qztct\" (UID: \"e31ca37d-89f4-4fb4-9f61-d0df5565b487\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:06.864097 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.864073 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:24:06.941026 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.940994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-env-overrides\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941026 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-run-ovn-kubernetes\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-kubelet\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-run-ovn-kubernetes\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtslj\" (UniqueName: \"kubernetes.io/projected/8a2d315f-d3a3-4cab-97d9-becca3a12249-kube-api-access-vtslj\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-systemd-units\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-systemd\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-log-socket\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-kubelet\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-log-socket\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941233 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-systemd\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-systemd-units\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-ovnkube-script-lib\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a2d315f-d3a3-4cab-97d9-becca3a12249-host\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-etc-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a2d315f-d3a3-4cab-97d9-becca3a12249-host\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-node-log\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-etc-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-var-lib-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-node-log\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-var-lib-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8a2d315f-d3a3-4cab-97d9-becca3a12249-serviceca\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-openvswitch\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-cni-bin\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-cni-bin\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3ff3f6de-097e-4812-8f8e-276c41254178-ovn-node-metrics-cert\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-env-overrides\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.941689 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j74rw\" (UniqueName: \"kubernetes.io/projected/0b0c36b6-3279-4629-991c-70026ff0d0b6-kube-api-access-j74rw\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-slash\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-run-netns\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.941658 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-ovnkube-config\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:06.941723 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:07.441704082 +0000 UTC m=+3.164248451 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlncf\" (UniqueName: \"kubernetes.io/projected/3ff3f6de-097e-4812-8f8e-276c41254178-kube-api-access-dlncf\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-ovn\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-cni-netd\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8a2d315f-d3a3-4cab-97d9-becca3a12249-serviceca\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941865 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-cni-netd\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-slash\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941913 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-run-ovn\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941981 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ff3f6de-097e-4812-8f8e-276c41254178-host-run-netns\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942368 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.941992 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-ovnkube-script-lib\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.942873 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.942153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ff3f6de-097e-4812-8f8e-276c41254178-ovnkube-config\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.943656 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.943636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ff3f6de-097e-4812-8f8e-276c41254178-ovn-node-metrics-cert\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.952257 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.952201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtslj\" (UniqueName: \"kubernetes.io/projected/8a2d315f-d3a3-4cab-97d9-becca3a12249-kube-api-access-vtslj\") pod \"node-ca-cv6tw\" (UID: \"8a2d315f-d3a3-4cab-97d9-becca3a12249\") " pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:06.952429 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.952405 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlncf\" (UniqueName: \"kubernetes.io/projected/3ff3f6de-097e-4812-8f8e-276c41254178-kube-api-access-dlncf\") pod \"ovnkube-node-dj5w9\" (UID: \"3ff3f6de-097e-4812-8f8e-276c41254178\") " pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:06.952557 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:06.952543 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74rw\" (UniqueName: \"kubernetes.io/projected/0b0c36b6-3279-4629-991c-70026ff0d0b6-kube-api-access-j74rw\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:07.026642 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.026606 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-57m87" Apr 16 16:24:07.034374 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.034345 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x7ld8" Apr 16 16:24:07.043239 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.043218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wflp7" Apr 16 16:24:07.048849 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.048824 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" Apr 16 16:24:07.055432 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.055414 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vjlfb" Apr 16 16:24:07.061025 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.061008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lsqww" Apr 16 16:24:07.067564 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.067546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4stxv" Apr 16 16:24:07.074108 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.074092 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cv6tw" Apr 16 16:24:07.079653 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.079635 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:24:07.445108 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.445086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:07.445199 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.445118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:07.445253 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:07.445237 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:07.445298 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:07.445257 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:07.445298 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:07.445266 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:07.445298 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:07.445280 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:07.445387 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:07.445351 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:08.445300593 +0000 UTC m=+4.167844966 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:07.445387 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:07.445370 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:08.445361048 +0000 UTC m=+4.167905428 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:07.467302 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.467267 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2d315f_d3a3_4cab_97d9_becca3a12249.slice/crio-ffe52c3cd219f8e670a5a63973c43d450841740b7684719522d2f00028a48328 WatchSource:0}: Error finding container ffe52c3cd219f8e670a5a63973c43d450841740b7684719522d2f00028a48328: Status 404 returned error can't find the container with id ffe52c3cd219f8e670a5a63973c43d450841740b7684719522d2f00028a48328 Apr 16 16:24:07.470795 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.470769 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6722ce7_f829_490e_9ee8_97b3a979ca09.slice/crio-db331e387fd5ab9e06efea891c07074c0407f01344e54ae5f2654327a13268e2 WatchSource:0}: Error finding container db331e387fd5ab9e06efea891c07074c0407f01344e54ae5f2654327a13268e2: Status 404 returned error can't find the container with id db331e387fd5ab9e06efea891c07074c0407f01344e54ae5f2654327a13268e2 Apr 16 16:24:07.472602 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.472577 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff3f6de_097e_4812_8f8e_276c41254178.slice/crio-2e951222ec108e86681f4dd8de78132ff3a1f9b26839e40cf8c7cbf7189293d3 WatchSource:0}: Error finding container 2e951222ec108e86681f4dd8de78132ff3a1f9b26839e40cf8c7cbf7189293d3: Status 404 returned error can't find the container with id 2e951222ec108e86681f4dd8de78132ff3a1f9b26839e40cf8c7cbf7189293d3 Apr 16 16:24:07.473534 
ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.473506 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode746431a_e308_4ccf_87fa_0969f2b40152.slice/crio-fec206972e753bec79c0b29a8556728ea9bcfa09eb734fcf21cee7e678ef01c8 WatchSource:0}: Error finding container fec206972e753bec79c0b29a8556728ea9bcfa09eb734fcf21cee7e678ef01c8: Status 404 returned error can't find the container with id fec206972e753bec79c0b29a8556728ea9bcfa09eb734fcf21cee7e678ef01c8 Apr 16 16:24:07.474454 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.474366 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode95b712d_7106_4990_96bd_48d8764b3a55.slice/crio-595ab1bfa2b509ff04f011dac962cf496b29ef829d700e7fb4ff010edb7bfcbd WatchSource:0}: Error finding container 595ab1bfa2b509ff04f011dac962cf496b29ef829d700e7fb4ff010edb7bfcbd: Status 404 returned error can't find the container with id 595ab1bfa2b509ff04f011dac962cf496b29ef829d700e7fb4ff010edb7bfcbd Apr 16 16:24:07.475535 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.475493 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cad9b82_f6b2_4330_bd44_9e2cbb2b5e59.slice/crio-36350712931e0bb001f2bdb8fbb6244ef3be33ec17af0aeea5ae4e980fff3d8f WatchSource:0}: Error finding container 36350712931e0bb001f2bdb8fbb6244ef3be33ec17af0aeea5ae4e980fff3d8f: Status 404 returned error can't find the container with id 36350712931e0bb001f2bdb8fbb6244ef3be33ec17af0aeea5ae4e980fff3d8f Apr 16 16:24:07.476574 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.476389 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f749d5c_04b9_4ecc_8853_c8deff057ad4.slice/crio-a4a7c172a5c9f653c3312038d65a22453f09f70c4f492c71737f3cc01d84d82a WatchSource:0}: Error 
finding container a4a7c172a5c9f653c3312038d65a22453f09f70c4f492c71737f3cc01d84d82a: Status 404 returned error can't find the container with id a4a7c172a5c9f653c3312038d65a22453f09f70c4f492c71737f3cc01d84d82a Apr 16 16:24:07.477030 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.477009 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode31ca37d_89f4_4fb4_9f61_d0df5565b487.slice/crio-fb7c0a62878c33af2fcba1aebae83987f38b182b3ed1fa6ce73af23e2daf84f0 WatchSource:0}: Error finding container fb7c0a62878c33af2fcba1aebae83987f38b182b3ed1fa6ce73af23e2daf84f0: Status 404 returned error can't find the container with id fb7c0a62878c33af2fcba1aebae83987f38b182b3ed1fa6ce73af23e2daf84f0 Apr 16 16:24:07.478170 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:07.478146 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d333c14_36c2_41c1_af1c_1c7a8daaee58.slice/crio-3d7c17c42b339016731fd18ca7c6a8613d6a0a238ebb2240acfa724168220774 WatchSource:0}: Error finding container 3d7c17c42b339016731fd18ca7c6a8613d6a0a238ebb2240acfa724168220774: Status 404 returned error can't find the container with id 3d7c17c42b339016731fd18ca7c6a8613d6a0a238ebb2240acfa724168220774 Apr 16 16:24:07.766424 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.766226 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:19:05 +0000 UTC" deadline="2027-11-14 12:10:25.511707123 +0000 UTC" Apr 16 16:24:07.766424 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.766419 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13843h46m17.745290873s" Apr 16 16:24:07.857795 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.857757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-57m87" event={"ID":"e95b712d-7106-4990-96bd-48d8764b3a55","Type":"ContainerStarted","Data":"595ab1bfa2b509ff04f011dac962cf496b29ef829d700e7fb4ff010edb7bfcbd"} Apr 16 16:24:07.859008 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.858954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"2e951222ec108e86681f4dd8de78132ff3a1f9b26839e40cf8c7cbf7189293d3"} Apr 16 16:24:07.859983 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.859960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4stxv" event={"ID":"e6722ce7-f829-490e-9ee8-97b3a979ca09","Type":"ContainerStarted","Data":"db331e387fd5ab9e06efea891c07074c0407f01344e54ae5f2654327a13268e2"} Apr 16 16:24:07.860960 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.860912 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x7ld8" event={"ID":"9d333c14-36c2-41c1-af1c-1c7a8daaee58","Type":"ContainerStarted","Data":"3d7c17c42b339016731fd18ca7c6a8613d6a0a238ebb2240acfa724168220774"} Apr 16 16:24:07.862115 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.862078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerStarted","Data":"a4a7c172a5c9f653c3312038d65a22453f09f70c4f492c71737f3cc01d84d82a"} Apr 16 16:24:07.863140 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.863117 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vjlfb" event={"ID":"e746431a-e308-4ccf-87fa-0969f2b40152","Type":"ContainerStarted","Data":"fec206972e753bec79c0b29a8556728ea9bcfa09eb734fcf21cee7e678ef01c8"} Apr 16 16:24:07.864407 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.864385 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cv6tw" event={"ID":"8a2d315f-d3a3-4cab-97d9-becca3a12249","Type":"ContainerStarted","Data":"ffe52c3cd219f8e670a5a63973c43d450841740b7684719522d2f00028a48328"} Apr 16 16:24:07.866035 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.866011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal" event={"ID":"b52ab70ac24bb6b9ef3948f422fe6d61","Type":"ContainerStarted","Data":"6c5db03aa3a6440c90de759c156f8c5dd2519e0d6c2f95e58f461c1f5ef92dde"} Apr 16 16:24:07.867431 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.867407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" event={"ID":"e31ca37d-89f4-4fb4-9f61-d0df5565b487","Type":"ContainerStarted","Data":"fb7c0a62878c33af2fcba1aebae83987f38b182b3ed1fa6ce73af23e2daf84f0"} Apr 16 16:24:07.868668 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.868644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lsqww" event={"ID":"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59","Type":"ContainerStarted","Data":"36350712931e0bb001f2bdb8fbb6244ef3be33ec17af0aeea5ae4e980fff3d8f"} Apr 16 16:24:07.878418 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:07.878370 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-64.ec2.internal" podStartSLOduration=2.878353737 podStartE2EDuration="2.878353737s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:07.877995759 +0000 UTC m=+3.600540149" watchObservedRunningTime="2026-04-16 16:24:07.878353737 +0000 UTC m=+3.600898129" Apr 16 16:24:08.453139 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:08.452580 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:08.453139 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:08.452629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:08.453139 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.452767 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:08.453139 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.452967 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:08.453139 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.452984 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:08.453139 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.452996 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:08.453722 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.453639 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:10.452808276 +0000 UTC m=+6.175352666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:08.453722 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.453670 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:10.453656021 +0000 UTC m=+6.176200396 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:08.850103 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:08.850070 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:08.850552 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.850205 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:08.850552 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:08.850292 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:08.850552 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:08.850399 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:08.885600 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:08.885562 2578 generic.go:358] "Generic (PLEG): container finished" podID="b8a86ca79ec9db287bd2c4a7f1165a3f" containerID="caca4575503fa5885afaf0f88802e8c25cac3ee8ff8f5d842a97236996616ecd" exitCode=0 Apr 16 16:24:08.886622 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:08.886589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal" event={"ID":"b8a86ca79ec9db287bd2c4a7f1165a3f","Type":"ContainerDied","Data":"caca4575503fa5885afaf0f88802e8c25cac3ee8ff8f5d842a97236996616ecd"} Apr 16 16:24:09.891426 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:09.891390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal" event={"ID":"b8a86ca79ec9db287bd2c4a7f1165a3f","Type":"ContainerStarted","Data":"1f137dc3c465e11c7e8985f293e22adadbcba8322055a5a82539bda585e4497b"} Apr 16 16:24:09.913166 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:09.913110 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-64.ec2.internal" podStartSLOduration=4.913089172 podStartE2EDuration="4.913089172s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:09.912301022 +0000 UTC m=+5.634845414" watchObservedRunningTime="2026-04-16 16:24:09.913089172 +0000 UTC m=+5.635633563" Apr 16 16:24:10.470945 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:10.470818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:10.470945 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:10.470885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:10.471157 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.471001 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:10.471157 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.471026 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:10.471157 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.471040 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx 
for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:10.471157 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.471104 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:14.471083738 +0000 UTC m=+10.193628119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:10.471529 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.471504 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:10.471669 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.471569 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:14.471552952 +0000 UTC m=+10.194097324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:10.850072 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:10.850040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:10.850263 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:10.850040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:10.850263 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.850197 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:10.850263 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:10.850238 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:12.849977 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:12.849580 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:12.849977 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:12.849720 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:12.850935 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:12.850612 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:12.850935 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:12.850715 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:14.508166 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:14.508126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:14.508184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.508305 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.508330 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.508335 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.508345 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.508396 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:22.508376921 +0000 UTC m=+18.230921291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:14.508633 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.508415 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:22.508405026 +0000 UTC m=+18.230949397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:14.851818 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:14.851228 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:14.851818 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.851334 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:14.851818 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:14.851382 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:14.851818 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:14.851448 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:16.849883 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:16.849833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:16.849883 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:16.849870 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:16.850382 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:16.850016 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:16.850382 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:16.850149 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:18.850617 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:18.850578 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:18.851080 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:18.850715 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:18.851080 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:18.850578 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:18.851080 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:18.850871 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:20.850447 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:20.850408 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:20.850915 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:20.850408 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:20.850915 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:20.850565 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:20.850915 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:20.850594 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:22.561968 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:22.561914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:22.561977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.562095 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.562166 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.562144558 +0000 UTC m=+34.284688928 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.562095 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.562210 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.562225 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:22.562458 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.562285 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.562271829 +0000 UTC m=+34.284816200 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:22.850459 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:22.850381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:22.850631 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:22.850389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:22.850631 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.850528 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:22.850631 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:22.850581 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:24.850581 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.850274 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:24.851372 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.850364 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:24.851372 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:24.850657 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:24.851372 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:24.850772 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:24.918323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.918291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-57m87" event={"ID":"e95b712d-7106-4990-96bd-48d8764b3a55","Type":"ContainerStarted","Data":"ea24bbe8b3a5a75ddfa3dff01829d25bd2307cc6aa3c66a442eb2d79d7b070ff"} Apr 16 16:24:24.921994 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.921969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:24:24.922319 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.922298 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ff3f6de-097e-4812-8f8e-276c41254178" containerID="3b2f2f67d55f5a75d97dd29c2e0108c477cfe8abcb965991b1256d2f9f2ae643" exitCode=1 Apr 16 16:24:24.922399 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.922361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"e2d74800859643962aeaaadfe916d2453f17053a4f7e43822ca061f2b7dce5f5"} Apr 16 16:24:24.922399 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.922380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"f3e179403a7a6f77217206ce43caca7648e05421dd7f95bfb822ebfb10c87328"} Apr 16 16:24:24.922399 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.922390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerDied","Data":"3b2f2f67d55f5a75d97dd29c2e0108c477cfe8abcb965991b1256d2f9f2ae643"} Apr 16 
16:24:24.922399 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.922399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"3e948bc72136107ecdaa3c32937e8a69a1d1ee497bca7b1dd24afd64f53fad88"} Apr 16 16:24:24.923951 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.923894 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4stxv" event={"ID":"e6722ce7-f829-490e-9ee8-97b3a979ca09","Type":"ContainerStarted","Data":"800fab1ed79b8b6dd3ed984229cdb9825ea119a1a6d1aae514c697a282239520"} Apr 16 16:24:24.925324 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.925294 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x7ld8" event={"ID":"9d333c14-36c2-41c1-af1c-1c7a8daaee58","Type":"ContainerStarted","Data":"030e7622d94b259e8355625787eddc72a263199a1ad449d90dbc5b92a02611b9"} Apr 16 16:24:24.926815 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.926790 2578 generic.go:358] "Generic (PLEG): container finished" podID="7f749d5c-04b9-4ecc-8853-c8deff057ad4" containerID="a75a3e6304589a0ea7634efa54a934ae9ac93e777cc31c757dfc88d734c701b2" exitCode=0 Apr 16 16:24:24.926892 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.926850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerDied","Data":"a75a3e6304589a0ea7634efa54a934ae9ac93e777cc31c757dfc88d734c701b2"} Apr 16 16:24:24.928273 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.928244 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cv6tw" event={"ID":"8a2d315f-d3a3-4cab-97d9-becca3a12249","Type":"ContainerStarted","Data":"d154cca6afa553a3ccfe4f4024b1b89b992238950908ef4961eba21599a2c1f6"} Apr 16 16:24:24.929798 ip-10-0-128-64 kubenswrapper[2578]: 
I0416 16:24:24.929768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" event={"ID":"e31ca37d-89f4-4fb4-9f61-d0df5565b487","Type":"ContainerStarted","Data":"062dd7c2f08e697ffd32f1d7978e2d5341fdc0688036dd4682cc2666a4123d05"}
Apr 16 16:24:24.931150 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.931125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lsqww" event={"ID":"0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59","Type":"ContainerStarted","Data":"2ed1bb8a8215140e7171fa18a4ee6c4769b6c3657baa6217bacf86837689bc18"}
Apr 16 16:24:24.936481 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.936431 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-57m87" podStartSLOduration=4.068337584 podStartE2EDuration="20.936418372s" podCreationTimestamp="2026-04-16 16:24:04 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.476397366 +0000 UTC m=+3.198941741" lastFinishedPulling="2026-04-16 16:24:24.34447815 +0000 UTC m=+20.067022529" observedRunningTime="2026-04-16 16:24:24.93622846 +0000 UTC m=+20.658772851" watchObservedRunningTime="2026-04-16 16:24:24.936418372 +0000 UTC m=+20.658962762"
Apr 16 16:24:24.952878 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:24.952841 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-x7ld8" podStartSLOduration=4.088491821 podStartE2EDuration="20.952829282s" podCreationTimestamp="2026-04-16 16:24:04 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.480041758 +0000 UTC m=+3.202586131" lastFinishedPulling="2026-04-16 16:24:24.344379221 +0000 UTC m=+20.066923592" observedRunningTime="2026-04-16 16:24:24.952415651 +0000 UTC m=+20.674960041" watchObservedRunningTime="2026-04-16 16:24:24.952829282 +0000 UTC m=+20.675373673"
Apr 16 16:24:25.000228 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.000132 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lsqww" podStartSLOduration=2.894485516 podStartE2EDuration="20.000118863s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.477575161 +0000 UTC m=+3.200119533" lastFinishedPulling="2026-04-16 16:24:24.5832085 +0000 UTC m=+20.305752880" observedRunningTime="2026-04-16 16:24:24.977209534 +0000 UTC m=+20.699753924" watchObservedRunningTime="2026-04-16 16:24:25.000118863 +0000 UTC m=+20.722663264"
Apr 16 16:24:25.016014 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.015964 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4stxv" podStartSLOduration=3.093828864 podStartE2EDuration="20.015948845s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.472370991 +0000 UTC m=+3.194915360" lastFinishedPulling="2026-04-16 16:24:24.394490952 +0000 UTC m=+20.117035341" observedRunningTime="2026-04-16 16:24:25.015448706 +0000 UTC m=+20.737993095" watchObservedRunningTime="2026-04-16 16:24:25.015948845 +0000 UTC m=+20.738493236"
Apr 16 16:24:25.030800 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.030757 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cv6tw" podStartSLOduration=3.1564788249999998 podStartE2EDuration="20.030744198s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.470044162 +0000 UTC m=+3.192588531" lastFinishedPulling="2026-04-16 16:24:24.344309533 +0000 UTC m=+20.066853904" observedRunningTime="2026-04-16 16:24:25.03021365 +0000 UTC m=+20.752758039" watchObservedRunningTime="2026-04-16 16:24:25.030744198 +0000 UTC m=+20.753288588"
Apr 16 16:24:25.536020 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.535949 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 16:24:25.795803 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.795687 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:24:25.535971121Z","UUID":"6c5e811d-c44a-4a39-a6db-5066d253a2cb","Handler":null,"Name":"","Endpoint":""}
Apr 16 16:24:25.799028 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.799002 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 16:24:25.799181 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.799035 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 16:24:25.936879 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.936841 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" event={"ID":"e31ca37d-89f4-4fb4-9f61-d0df5565b487","Type":"ContainerStarted","Data":"a60de5567904a1d75f7001510fca0ac68737ea964f4ed38267c46b538ed3186f"}
Apr 16 16:24:25.939821 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.939799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 16:24:25.940254 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.940227 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"cdcfe96a8c1fcc9e5694d69643cf92934d956aa38d420bc60d5bcaa816898979"}
Apr 16 16:24:25.940332 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.940264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"fd4bbd5e4e8d0a88984e4f397527cbaf8b862d45c7e303eb5b52fe590c12dc99"}
Apr 16 16:24:25.941748 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:25.941722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vjlfb" event={"ID":"e746431a-e308-4ccf-87fa-0969f2b40152","Type":"ContainerStarted","Data":"c63d5365a5f329d01d6c601c77458e5136028705d6d7ba0540cdcf2008f23df4"}
Apr 16 16:24:26.850490 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:26.850458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:26.850704 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:26.850498 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:26.850704 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:26.850590 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:26.850704 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:26.850683 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:26.946256 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:26.946222 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" event={"ID":"e31ca37d-89f4-4fb4-9f61-d0df5565b487","Type":"ContainerStarted","Data":"c8cf76d5f81d7dd70bc8daa5568a435c47b2e8ffa3948006034a58b7da578321"}
Apr 16 16:24:26.978393 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:26.978337 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vjlfb" podStartSLOduration=5.060787875 podStartE2EDuration="21.978319729s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.475494831 +0000 UTC m=+3.198039201" lastFinishedPulling="2026-04-16 16:24:24.393026682 +0000 UTC m=+20.115571055" observedRunningTime="2026-04-16 16:24:25.957759998 +0000 UTC m=+21.680304391" watchObservedRunningTime="2026-04-16 16:24:26.978319729 +0000 UTC m=+22.700864170"
Apr 16 16:24:26.978892 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:26.978857 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qztct" podStartSLOduration=3.093779891 podStartE2EDuration="21.978846175s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.479756164 +0000 UTC m=+3.202300538" lastFinishedPulling="2026-04-16 16:24:26.36482245 +0000 UTC m=+22.087366822" observedRunningTime="2026-04-16 16:24:26.977810396 +0000 UTC m=+22.700354809" watchObservedRunningTime="2026-04-16 16:24:26.978846175 +0000 UTC m=+22.701390565"
Apr 16 16:24:27.950656 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:27.950627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 16:24:27.951138 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:27.950985 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"f969a7e0d98fa0fd4fbfe5d5a02b2aac2f67bb12c3e176537965471857182f1a"}
Apr 16 16:24:28.849809 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:28.849771 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:28.850024 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:28.849935 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:28.850024 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:28.849789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:28.850152 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:28.850074 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:29.550736 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.550558 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-x7ld8"
Apr 16 16:24:29.551676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.551387 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-x7ld8"
Apr 16 16:24:29.957201 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.957176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 16:24:29.957530 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.957504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"8d6674a30d68a5079aeafdcd48cee4c2cfa9610bb71c5226350d035773f41830"}
Apr 16 16:24:29.957840 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.957816 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:29.958022 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.958008 2578 scope.go:117] "RemoveContainer" containerID="3b2f2f67d55f5a75d97dd29c2e0108c477cfe8abcb965991b1256d2f9f2ae643"
Apr 16 16:24:29.959313 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.959290 2578 generic.go:358] "Generic (PLEG): container finished" podID="7f749d5c-04b9-4ecc-8853-c8deff057ad4" containerID="5f89a0eb5a9ed2cec086e4cbde7aabe6595d769857890f859a4d85a416ef167a" exitCode=0
Apr 16 16:24:29.959408 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.959373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerDied","Data":"5f89a0eb5a9ed2cec086e4cbde7aabe6595d769857890f859a4d85a416ef167a"}
Apr 16 16:24:29.959652 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.959623 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-x7ld8"
Apr 16 16:24:29.960252 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.960222 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-x7ld8"
Apr 16 16:24:29.973679 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:29.973657 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:30.849760 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.849579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:30.850117 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.849579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:30.850117 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:30.849868 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:30.850117 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:30.849895 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:30.965666 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.965597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 16:24:30.966042 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.966005 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" event={"ID":"3ff3f6de-097e-4812-8f8e-276c41254178","Type":"ContainerStarted","Data":"2f3808ecdee2661e6734e5bb39eecda837243001e4548440e2d9b9125c51b8f6"}
Apr 16 16:24:30.966465 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.966406 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:30.966465 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.966443 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:30.968478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.968451 2578 generic.go:358] "Generic (PLEG): container finished" podID="7f749d5c-04b9-4ecc-8853-c8deff057ad4" containerID="ee7b1a075660153ebf4897f4e0cde8d67dc7352d6f749eeb4e7a5f185b6401ca" exitCode=0
Apr 16 16:24:30.968586 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.968539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerDied","Data":"ee7b1a075660153ebf4897f4e0cde8d67dc7352d6f749eeb4e7a5f185b6401ca"}
Apr 16 16:24:30.983221 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:30.983197 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9"
Apr 16 16:24:31.008479 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.008426 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" podStartSLOduration=9.048274306 podStartE2EDuration="26.0084098s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.474646663 +0000 UTC m=+3.197191045" lastFinishedPulling="2026-04-16 16:24:24.434782155 +0000 UTC m=+20.157326539" observedRunningTime="2026-04-16 16:24:31.007647885 +0000 UTC m=+26.730192274" watchObservedRunningTime="2026-04-16 16:24:31.0084098 +0000 UTC m=+26.730954191"
Apr 16 16:24:31.238000 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.237917 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n26xz"]
Apr 16 16:24:31.238148 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.238024 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:31.238148 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:31.238100 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:31.244558 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.244530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-stfn4"]
Apr 16 16:24:31.244701 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.244628 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:31.244764 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:31.244722 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:31.972909 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.972827 2578 generic.go:358] "Generic (PLEG): container finished" podID="7f749d5c-04b9-4ecc-8853-c8deff057ad4" containerID="40033e8e3d253b0e4bf51e48712100ad85e3996b9183b8e01b1b5c6b14ddb486" exitCode=0
Apr 16 16:24:31.973393 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:31.972911 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerDied","Data":"40033e8e3d253b0e4bf51e48712100ad85e3996b9183b8e01b1b5c6b14ddb486"}
Apr 16 16:24:32.850635 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:32.850603 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:32.850823 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:32.850606 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:32.850823 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:32.850743 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:32.850967 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:32.850826 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:32.925076 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:32.925049 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4stxv_e6722ce7-f829-490e-9ee8-97b3a979ca09/dns-node-resolver/0.log"
Apr 16 16:24:34.107168 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:34.107141 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cv6tw_8a2d315f-d3a3-4cab-97d9-becca3a12249/node-ca/0.log"
Apr 16 16:24:34.854338 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:34.854298 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:34.854547 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:34.854429 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:34.854547 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:34.854486 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:34.854675 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:34.854595 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:36.850381 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:36.850160 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:36.850842 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:36.850230 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:36.850842 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:36.850476 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:36.850842 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:36.850570 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:37.986791 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:37.986760 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerStarted","Data":"173f64c970b078824b031bb49f237c4f4b1469e28fc7557bfc4da627fca1c3c1"}
Apr 16 16:24:38.572848 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:38.572809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:38.572848 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:38.572853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:38.573213 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.572957 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:24:38.573213 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.573019 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs podName:0b0c36b6-3279-4629-991c-70026ff0d0b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:10.572996299 +0000 UTC m=+66.295540672 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs") pod "network-metrics-daemon-stfn4" (UID: "0b0c36b6-3279-4629-991c-70026ff0d0b6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:24:38.573213 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.573029 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:24:38.573213 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.573052 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:24:38.573213 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.573066 2578 projected.go:194] Error preparing data for projected volume kube-api-access-c9vnx for pod openshift-network-diagnostics/network-check-target-n26xz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:24:38.573213 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.573126 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx podName:eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e nodeName:}" failed. No retries permitted until 2026-04-16 16:25:10.573110247 +0000 UTC m=+66.295654628 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9vnx" (UniqueName: "kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx") pod "network-check-target-n26xz" (UID: "eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:24:38.849942 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:38.849830 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:38.849942 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:38.849831 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:38.850174 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.849968 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:38.850174 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:38.850004 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:38.990981 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:38.990950 2578 generic.go:358] "Generic (PLEG): container finished" podID="7f749d5c-04b9-4ecc-8853-c8deff057ad4" containerID="173f64c970b078824b031bb49f237c4f4b1469e28fc7557bfc4da627fca1c3c1" exitCode=0
Apr 16 16:24:38.991519 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:38.991017 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerDied","Data":"173f64c970b078824b031bb49f237c4f4b1469e28fc7557bfc4da627fca1c3c1"}
Apr 16 16:24:39.995222 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:39.995192 2578 generic.go:358] "Generic (PLEG): container finished" podID="7f749d5c-04b9-4ecc-8853-c8deff057ad4" containerID="42c47fae93cd7bdf847b60cdddfadace0effe50b73b6c900effb9b28fd3d3b63" exitCode=0
Apr 16 16:24:39.995579 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:39.995228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerDied","Data":"42c47fae93cd7bdf847b60cdddfadace0effe50b73b6c900effb9b28fd3d3b63"}
Apr 16 16:24:40.850540 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:40.850504 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:40.850697 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:40.850611 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:40.850697 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:40.850687 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:40.850817 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:40.850801 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:40.999536 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:40.999456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wflp7" event={"ID":"7f749d5c-04b9-4ecc-8853-c8deff057ad4","Type":"ContainerStarted","Data":"069499f0dd163b83e3d0fbfae6c9a6a5b8258e13f365fe8c657d0ffa9e8aa642"}
Apr 16 16:24:41.028277 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:41.028228 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wflp7" podStartSLOduration=6.714996431 podStartE2EDuration="37.02821473s" podCreationTimestamp="2026-04-16 16:24:04 +0000 UTC" firstStartedPulling="2026-04-16 16:24:07.478571269 +0000 UTC m=+3.201115637" lastFinishedPulling="2026-04-16 16:24:37.791789564 +0000 UTC m=+33.514333936" observedRunningTime="2026-04-16 16:24:41.026721684 +0000 UTC m=+36.749266073" watchObservedRunningTime="2026-04-16 16:24:41.02821473 +0000 UTC m=+36.750759120"
Apr 16 16:24:42.850075 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:42.850044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:42.850427 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:42.850156 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:42.850427 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:42.850215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:42.850427 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:42.850319 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:44.850641 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:44.850611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:44.851013 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:44.850701 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:44.851013 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:44.850793 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:44.851013 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:44.850902 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:46.849546 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:46.849509 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:46.849949 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:46.849625 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:46.849949 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:46.849690 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:46.849949 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:46.849792 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:48.849966 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:48.849936 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4"
Apr 16 16:24:48.850400 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:48.850035 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6"
Apr 16 16:24:48.850400 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:48.850092 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:48.850400 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:48.850167 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e"
Apr 16 16:24:50.849640 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:50.849606 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz"
Apr 16 16:24:50.850115 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:50.849612 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:50.850115 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:50.849712 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n26xz" podUID="eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e" Apr 16 16:24:50.850115 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:24:50.849828 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stfn4" podUID="0b0c36b6-3279-4629-991c-70026ff0d0b6" Apr 16 16:24:51.576387 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.576356 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-64.ec2.internal" event="NodeReady" Apr 16 16:24:51.576550 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.576485 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:24:51.694799 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.694764 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7ngj9"] Apr 16 16:24:51.741783 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.741754 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7ngj9"] Apr 16 16:24:51.741953 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.741813 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.745214 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.745198 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:24:51.745297 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.745281 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-psw85\"" Apr 16 16:24:51.745897 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.745871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:24:51.775834 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.775804 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9gp2s"] Apr 16 16:24:51.793501 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.793474 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c2j7c"] Apr 16 16:24:51.793651 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.793607 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:51.798213 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.798194 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:24:51.798751 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.798720 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:24:51.798855 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.798798 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:24:51.799039 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.798732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hp8vh\"" Apr 16 16:24:51.812013 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.811992 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-ht495"] Apr 16 16:24:51.812153 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.812137 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:51.815990 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.815968 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:24:51.815990 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.815976 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:24:51.816165 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.815977 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:24:51.816165 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.816130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:24:51.816269 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.816203 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dpxrk\"" Apr 16 16:24:51.839348 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.839283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c2j7c"] Apr 16 16:24:51.839348 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.839307 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9gp2s"] Apr 16 16:24:51.839348 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.839318 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-ht495"] Apr 16 16:24:51.839563 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.839430 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:24:51.841682 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.841668 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:24:51.841776 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.841690 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-c7rjz\"" Apr 16 16:24:51.842019 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.842005 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:24:51.873890 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.873861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc46429e-6d00-4382-9497-f38ad024a4a7-tmp-dir\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.874250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.873896 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prq27\" (UniqueName: \"kubernetes.io/projected/fc46429e-6d00-4382-9497-f38ad024a4a7-kube-api-access-prq27\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.874250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.873955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc46429e-6d00-4382-9497-f38ad024a4a7-config-volume\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.874250 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:51.873976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc46429e-6d00-4382-9497-f38ad024a4a7-metrics-tls\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.974562 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/024ed9a3-afbd-4d04-8b6a-d9546f86f606-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:51.974562 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974561 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/024ed9a3-afbd-4d04-8b6a-d9546f86f606-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69vx\" (UniqueName: \"kubernetes.io/projected/024ed9a3-afbd-4d04-8b6a-d9546f86f606-kube-api-access-w69vx\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc46429e-6d00-4382-9497-f38ad024a4a7-tmp-dir\") pod 
\"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfp5j\" (UniqueName: \"kubernetes.io/projected/774f54d2-48ef-4bb5-a2eb-225ce1d17845-kube-api-access-pfp5j\") pod \"downloads-586b57c7b4-ht495\" (UID: \"774f54d2-48ef-4bb5-a2eb-225ce1d17845\") " pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3221ff3b-b85e-4d3b-9c0f-30052ddc6800-cert\") pod \"ingress-canary-9gp2s\" (UID: \"3221ff3b-b85e-4d3b-9c0f-30052ddc6800\") " pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prq27\" (UniqueName: \"kubernetes.io/projected/fc46429e-6d00-4382-9497-f38ad024a4a7-kube-api-access-prq27\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/024ed9a3-afbd-4d04-8b6a-d9546f86f606-crio-socket\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:51.974816 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zch5x\" (UniqueName: \"kubernetes.io/projected/3221ff3b-b85e-4d3b-9c0f-30052ddc6800-kube-api-access-zch5x\") pod \"ingress-canary-9gp2s\" (UID: \"3221ff3b-b85e-4d3b-9c0f-30052ddc6800\") " pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:51.975121 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc46429e-6d00-4382-9497-f38ad024a4a7-config-volume\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.975121 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc46429e-6d00-4382-9497-f38ad024a4a7-metrics-tls\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.975121 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974867 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc46429e-6d00-4382-9497-f38ad024a4a7-tmp-dir\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.975121 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.974951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/024ed9a3-afbd-4d04-8b6a-d9546f86f606-data-volume\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:51.975372 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.975351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc46429e-6d00-4382-9497-f38ad024a4a7-config-volume\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.978949 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.978910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc46429e-6d00-4382-9497-f38ad024a4a7-metrics-tls\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:51.989628 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:51.989609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prq27\" (UniqueName: \"kubernetes.io/projected/fc46429e-6d00-4382-9497-f38ad024a4a7-kube-api-access-prq27\") pod \"dns-default-7ngj9\" (UID: \"fc46429e-6d00-4382-9497-f38ad024a4a7\") " pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:52.050892 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.050851 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:52.075668 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/024ed9a3-afbd-4d04-8b6a-d9546f86f606-crio-socket\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.075832 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zch5x\" (UniqueName: \"kubernetes.io/projected/3221ff3b-b85e-4d3b-9c0f-30052ddc6800-kube-api-access-zch5x\") pod \"ingress-canary-9gp2s\" (UID: \"3221ff3b-b85e-4d3b-9c0f-30052ddc6800\") " pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:52.075832 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/024ed9a3-afbd-4d04-8b6a-d9546f86f606-crio-socket\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.075964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/024ed9a3-afbd-4d04-8b6a-d9546f86f606-data-volume\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.075964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/024ed9a3-afbd-4d04-8b6a-d9546f86f606-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.075964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/024ed9a3-afbd-4d04-8b6a-d9546f86f606-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.075964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w69vx\" (UniqueName: \"kubernetes.io/projected/024ed9a3-afbd-4d04-8b6a-d9546f86f606-kube-api-access-w69vx\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.075964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfp5j\" (UniqueName: \"kubernetes.io/projected/774f54d2-48ef-4bb5-a2eb-225ce1d17845-kube-api-access-pfp5j\") pod \"downloads-586b57c7b4-ht495\" (UID: \"774f54d2-48ef-4bb5-a2eb-225ce1d17845\") " pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:24:52.076196 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.075982 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3221ff3b-b85e-4d3b-9c0f-30052ddc6800-cert\") pod \"ingress-canary-9gp2s\" (UID: \"3221ff3b-b85e-4d3b-9c0f-30052ddc6800\") " pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:52.078401 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:24:52.078378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3221ff3b-b85e-4d3b-9c0f-30052ddc6800-cert\") pod \"ingress-canary-9gp2s\" (UID: \"3221ff3b-b85e-4d3b-9c0f-30052ddc6800\") " pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:52.080159 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.080131 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/024ed9a3-afbd-4d04-8b6a-d9546f86f606-data-volume\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.080462 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.080440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/024ed9a3-afbd-4d04-8b6a-d9546f86f606-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.082102 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.082079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/024ed9a3-afbd-4d04-8b6a-d9546f86f606-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.110035 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.109958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfp5j\" (UniqueName: \"kubernetes.io/projected/774f54d2-48ef-4bb5-a2eb-225ce1d17845-kube-api-access-pfp5j\") pod \"downloads-586b57c7b4-ht495\" (UID: \"774f54d2-48ef-4bb5-a2eb-225ce1d17845\") " 
pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:24:52.111273 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.111253 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69vx\" (UniqueName: \"kubernetes.io/projected/024ed9a3-afbd-4d04-8b6a-d9546f86f606-kube-api-access-w69vx\") pod \"insights-runtime-extractor-c2j7c\" (UID: \"024ed9a3-afbd-4d04-8b6a-d9546f86f606\") " pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.118944 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.118910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zch5x\" (UniqueName: \"kubernetes.io/projected/3221ff3b-b85e-4d3b-9c0f-30052ddc6800-kube-api-access-zch5x\") pod \"ingress-canary-9gp2s\" (UID: \"3221ff3b-b85e-4d3b-9c0f-30052ddc6800\") " pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:52.119660 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.119642 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c2j7c" Apr 16 16:24:52.148090 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.148050 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:24:52.288945 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.288876 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c2j7c"] Apr 16 16:24:52.290472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.290445 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7ngj9"] Apr 16 16:24:52.293961 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:52.293932 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc46429e_6d00_4382_9497_f38ad024a4a7.slice/crio-0ef33e4dd5ba4ffd73f879d07e0bd7ce8b90c4702089fc32afd28c08db36196b WatchSource:0}: Error finding container 0ef33e4dd5ba4ffd73f879d07e0bd7ce8b90c4702089fc32afd28c08db36196b: Status 404 returned error can't find the container with id 0ef33e4dd5ba4ffd73f879d07e0bd7ce8b90c4702089fc32afd28c08db36196b Apr 16 16:24:52.294690 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:52.294669 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024ed9a3_afbd_4d04_8b6a_d9546f86f606.slice/crio-8dbc873a4beae5292617c6134b2c343c2d5585966ef805e9c17d66bbaba1b6de WatchSource:0}: Error finding container 8dbc873a4beae5292617c6134b2c343c2d5585966ef805e9c17d66bbaba1b6de: Status 404 returned error can't find the container with id 8dbc873a4beae5292617c6134b2c343c2d5585966ef805e9c17d66bbaba1b6de Apr 16 16:24:52.301863 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.301841 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-ht495"] Apr 16 16:24:52.304599 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:52.304573 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774f54d2_48ef_4bb5_a2eb_225ce1d17845.slice/crio-4fffa442b5fe02f55a678bdf357fa6d3a2d9c6efa235868423904cda01186284 WatchSource:0}: Error finding container 4fffa442b5fe02f55a678bdf357fa6d3a2d9c6efa235868423904cda01186284: Status 404 returned error can't find the container with id 4fffa442b5fe02f55a678bdf357fa6d3a2d9c6efa235868423904cda01186284 Apr 16 16:24:52.405721 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.405690 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9gp2s" Apr 16 16:24:52.521478 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.521438 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9gp2s"] Apr 16 16:24:52.525685 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:24:52.525656 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3221ff3b_b85e_4d3b_9c0f_30052ddc6800.slice/crio-b71ad1bf0de42ff3f2142cc21c06312b70401a4c650a9ea1b8f15487d911a29a WatchSource:0}: Error finding container b71ad1bf0de42ff3f2142cc21c06312b70401a4c650a9ea1b8f15487d911a29a: Status 404 returned error can't find the container with id b71ad1bf0de42ff3f2142cc21c06312b70401a4c650a9ea1b8f15487d911a29a Apr 16 16:24:52.852716 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.852683 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:24:52.852940 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.852905 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:24:52.857037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.856830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:24:52.858277 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.858010 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w5tzs\"" Apr 16 16:24:52.858277 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.858083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:24:52.858427 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.858324 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:24:52.858427 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:52.858373 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f6w5w\"" Apr 16 16:24:53.021080 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:53.021030 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ngj9" event={"ID":"fc46429e-6d00-4382-9497-f38ad024a4a7","Type":"ContainerStarted","Data":"0ef33e4dd5ba4ffd73f879d07e0bd7ce8b90c4702089fc32afd28c08db36196b"} Apr 16 16:24:53.022327 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:53.022295 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9gp2s" event={"ID":"3221ff3b-b85e-4d3b-9c0f-30052ddc6800","Type":"ContainerStarted","Data":"b71ad1bf0de42ff3f2142cc21c06312b70401a4c650a9ea1b8f15487d911a29a"} Apr 16 16:24:53.023913 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:53.023867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-c2j7c" event={"ID":"024ed9a3-afbd-4d04-8b6a-d9546f86f606","Type":"ContainerStarted","Data":"6e40526e162691d8804da63d8bfedf897cbe92c3361b96ef2ec897523c992ad2"} Apr 16 16:24:53.024032 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:53.023950 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2j7c" event={"ID":"024ed9a3-afbd-4d04-8b6a-d9546f86f606","Type":"ContainerStarted","Data":"8dbc873a4beae5292617c6134b2c343c2d5585966ef805e9c17d66bbaba1b6de"} Apr 16 16:24:53.024999 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:53.024979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-ht495" event={"ID":"774f54d2-48ef-4bb5-a2eb-225ce1d17845","Type":"ContainerStarted","Data":"4fffa442b5fe02f55a678bdf357fa6d3a2d9c6efa235868423904cda01186284"} Apr 16 16:24:55.034415 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:55.032132 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9gp2s" event={"ID":"3221ff3b-b85e-4d3b-9c0f-30052ddc6800","Type":"ContainerStarted","Data":"47279162a75d449783320d32a5c4f7ac9de6a8d5a6d9faf9657ac90742221eb4"} Apr 16 16:24:55.035771 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:55.035737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2j7c" event={"ID":"024ed9a3-afbd-4d04-8b6a-d9546f86f606","Type":"ContainerStarted","Data":"123990b90e5e246fdaa3b98830c2e778da1e516610cce8525f8e0a321266a7e7"} Apr 16 16:24:55.037762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:55.037729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ngj9" event={"ID":"fc46429e-6d00-4382-9497-f38ad024a4a7","Type":"ContainerStarted","Data":"a2da9df63ff964ec7332e8e399d6ee6a40195fd8ea81fbea7a0e388c9bd0dc9f"} Apr 16 16:24:56.042690 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:56.042649 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ngj9" event={"ID":"fc46429e-6d00-4382-9497-f38ad024a4a7","Type":"ContainerStarted","Data":"be79f3c13a4cc7eb5398b5a55c1ecc7c88e449a10ebd609982edb3622d5471d8"} Apr 16 16:24:56.070806 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:56.070718 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9gp2s" podStartSLOduration=2.753848275 podStartE2EDuration="5.070670157s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.527469204 +0000 UTC m=+48.250013571" lastFinishedPulling="2026-04-16 16:24:54.844291072 +0000 UTC m=+50.566835453" observedRunningTime="2026-04-16 16:24:55.068825468 +0000 UTC m=+50.791369854" watchObservedRunningTime="2026-04-16 16:24:56.070670157 +0000 UTC m=+51.793214547" Apr 16 16:24:56.071609 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:56.071567 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7ngj9" podStartSLOduration=2.567042655 podStartE2EDuration="5.071552652s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.295980959 +0000 UTC m=+48.018525326" lastFinishedPulling="2026-04-16 16:24:54.800490952 +0000 UTC m=+50.523035323" observedRunningTime="2026-04-16 16:24:56.071173389 +0000 UTC m=+51.793717780" watchObservedRunningTime="2026-04-16 16:24:56.071552652 +0000 UTC m=+51.794097044" Apr 16 16:24:57.047491 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:57.047455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c2j7c" event={"ID":"024ed9a3-afbd-4d04-8b6a-d9546f86f606","Type":"ContainerStarted","Data":"d48ffdab242da67ca8706ed06a4ddc56d64ab8fd316aba2811c714421abf0e6a"} Apr 16 16:24:57.047992 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:57.047796 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-dns/dns-default-7ngj9" Apr 16 16:24:57.074130 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:24:57.074078 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c2j7c" podStartSLOduration=2.467175188 podStartE2EDuration="6.074023488s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.441425785 +0000 UTC m=+48.163970153" lastFinishedPulling="2026-04-16 16:24:56.048274071 +0000 UTC m=+51.770818453" observedRunningTime="2026-04-16 16:24:57.071837794 +0000 UTC m=+52.794382183" watchObservedRunningTime="2026-04-16 16:24:57.074023488 +0000 UTC m=+52.796567879" Apr 16 16:25:00.786380 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.786343 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54559f49f8-wf69c"] Apr 16 16:25:00.808466 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.808441 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54559f49f8-wf69c"] Apr 16 16:25:00.808632 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.808560 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.812273 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.812248 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:25:00.813456 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.813313 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:25:00.813456 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.813332 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:25:00.813456 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.813323 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cpqfd\"" Apr 16 16:25:00.813456 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.813371 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:25:00.814465 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.814448 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:25:00.849785 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.849748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-console-config\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.849785 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.849782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-oauth-config\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.849996 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.849806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-service-ca\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.849996 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.849831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75nc9\" (UniqueName: \"kubernetes.io/projected/92c39430-d955-4843-8da1-226b6e806017-kube-api-access-75nc9\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.849996 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.849867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-serving-cert\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.849996 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.849913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-oauth-serving-cert\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.951239 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:25:00.951210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-console-config\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.951239 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.951244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-oauth-config\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.951477 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.951271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-service-ca\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.951477 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.951291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75nc9\" (UniqueName: \"kubernetes.io/projected/92c39430-d955-4843-8da1-226b6e806017-kube-api-access-75nc9\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.951477 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.951331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-serving-cert\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " 
pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.951477 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.951364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-oauth-serving-cert\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.952224 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.952157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-console-config\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.952224 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.952173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-service-ca\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.952224 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.952202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-oauth-serving-cert\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.954139 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.954114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-oauth-config\") pod \"console-54559f49f8-wf69c\" (UID: 
\"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.954242 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.954223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-serving-cert\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:00.969105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:00.969081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75nc9\" (UniqueName: \"kubernetes.io/projected/92c39430-d955-4843-8da1-226b6e806017-kube-api-access-75nc9\") pod \"console-54559f49f8-wf69c\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") " pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:01.118790 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:01.118707 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:01.258467 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:01.258441 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54559f49f8-wf69c"] Apr 16 16:25:01.263278 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:01.263249 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c39430_d955_4843_8da1_226b6e806017.slice/crio-9525d3954937108178beea9953435a3db320dc1995c86ec2b0766c45a37cbf3b WatchSource:0}: Error finding container 9525d3954937108178beea9953435a3db320dc1995c86ec2b0766c45a37cbf3b: Status 404 returned error can't find the container with id 9525d3954937108178beea9953435a3db320dc1995c86ec2b0766c45a37cbf3b Apr 16 16:25:02.062289 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:02.062254 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559f49f8-wf69c" event={"ID":"92c39430-d955-4843-8da1-226b6e806017","Type":"ContainerStarted","Data":"9525d3954937108178beea9953435a3db320dc1995c86ec2b0766c45a37cbf3b"} Apr 16 16:25:02.985165 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:02.985134 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dj5w9" Apr 16 16:25:04.053756 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:04.053725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7ngj9" Apr 16 16:25:07.855957 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.855908 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-nkz7c"] Apr 16 16:25:07.867708 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.867684 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:07.872128 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.872099 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:25:07.872385 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.872333 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 16:25:07.873301 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.873279 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:25:07.873594 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.873573 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9j2vv\"" Apr 16 16:25:07.873967 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.873948 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 16:25:07.874221 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.874194 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:25:07.876277 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.876239 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-nkz7c"] Apr 16 16:25:07.900726 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.900694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: 
\"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:07.900879 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.900760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:07.900879 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.900779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krvnc\" (UniqueName: \"kubernetes.io/projected/91db00c7-daa7-456d-93c4-bda81def2d2d-kube-api-access-krvnc\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:07.900879 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:07.900808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91db00c7-daa7-456d-93c4-bda81def2d2d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.002115 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.002080 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91db00c7-daa7-456d-93c4-bda81def2d2d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.002296 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:25:08.002129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.002296 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.002204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.002296 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.002234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krvnc\" (UniqueName: \"kubernetes.io/projected/91db00c7-daa7-456d-93c4-bda81def2d2d-kube-api-access-krvnc\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.002460 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:25:08.002322 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 16:25:08.002460 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:25:08.002399 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls podName:91db00c7-daa7-456d-93c4-bda81def2d2d nodeName:}" failed. No retries permitted until 2026-04-16 16:25:08.502381337 +0000 UTC m=+64.224925724 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls") pod "prometheus-operator-78f957474d-nkz7c" (UID: "91db00c7-daa7-456d-93c4-bda81def2d2d") : secret "prometheus-operator-tls" not found Apr 16 16:25:08.003369 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.003347 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91db00c7-daa7-456d-93c4-bda81def2d2d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.004777 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.004754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.013732 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.013714 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krvnc\" (UniqueName: \"kubernetes.io/projected/91db00c7-daa7-456d-93c4-bda81def2d2d-kube-api-access-krvnc\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.507237 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.507198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: 
\"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:08.507396 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:25:08.507348 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 16:25:08.507439 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:25:08.507407 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls podName:91db00c7-daa7-456d-93c4-bda81def2d2d nodeName:}" failed. No retries permitted until 2026-04-16 16:25:09.507392212 +0000 UTC m=+65.229936604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls") pod "prometheus-operator-78f957474d-nkz7c" (UID: "91db00c7-daa7-456d-93c4-bda81def2d2d") : secret "prometheus-operator-tls" not found Apr 16 16:25:08.962654 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.962586 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-567c4bcc7b-9h4hq"] Apr 16 16:25:08.985081 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.985056 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-567c4bcc7b-9h4hq"] Apr 16 16:25:08.985225 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.985165 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:08.995344 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:08.995316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:25:09.011076 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-config\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.011200 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-oauth-config\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.011200 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011155 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72658\" (UniqueName: \"kubernetes.io/projected/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-kube-api-access-72658\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.011288 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-service-ca\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" 
Apr 16 16:25:09.011288 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-trusted-ca-bundle\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.011362 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-oauth-serving-cert\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.011406 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.011360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-serving-cert\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.082446 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.082410 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-ht495" event={"ID":"774f54d2-48ef-4bb5-a2eb-225ce1d17845","Type":"ContainerStarted","Data":"51aa54c050a8c900e0107a53e5217147aa22f2ebcde1fe5c1417b3f8d3ffa18c"} Apr 16 16:25:09.082651 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.082630 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:25:09.084268 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.084242 2578 patch_prober.go:28] interesting pod/downloads-586b57c7b4-ht495 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.8:8080/\": dial tcp 10.134.0.8:8080: connect: connection refused" start-of-body= Apr 16 16:25:09.084399 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.084291 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-ht495" podUID="774f54d2-48ef-4bb5-a2eb-225ce1d17845" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.8:8080/\": dial tcp 10.134.0.8:8080: connect: connection refused" Apr 16 16:25:09.104766 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.104725 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-ht495" podStartSLOduration=1.7481727 podStartE2EDuration="18.104693209s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.306288275 +0000 UTC m=+48.028832644" lastFinishedPulling="2026-04-16 16:25:08.662808786 +0000 UTC m=+64.385353153" observedRunningTime="2026-04-16 16:25:09.103433709 +0000 UTC m=+64.825978100" watchObservedRunningTime="2026-04-16 16:25:09.104693209 +0000 UTC m=+64.827237631" Apr 16 16:25:09.112291 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-oauth-config\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.112413 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72658\" (UniqueName: \"kubernetes.io/projected/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-kube-api-access-72658\") pod \"console-567c4bcc7b-9h4hq\" (UID: 
\"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.112513 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-service-ca\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.112573 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-trusted-ca-bundle\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.112622 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-oauth-serving-cert\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.112685 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-serving-cert\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.112738 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.112712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-config\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.113314 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.113288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-oauth-serving-cert\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.113439 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.113288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-service-ca\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.113439 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.113425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-config\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.113569 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.113543 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-trusted-ca-bundle\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.114794 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.114773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-oauth-config\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.115307 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.115288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-serving-cert\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.125796 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.125764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72658\" (UniqueName: \"kubernetes.io/projected/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-kube-api-access-72658\") pod \"console-567c4bcc7b-9h4hq\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.297450 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.297410 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:09.441360 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.441326 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-567c4bcc7b-9h4hq"] Apr 16 16:25:09.445358 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:09.445329 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bdb172_fc8d_4b50_b31d_7e254b8df08a.slice/crio-bf305e5130bda27b493f87d2348ae9741f58f049e095ffd45e65db5a7e993943 WatchSource:0}: Error finding container bf305e5130bda27b493f87d2348ae9741f58f049e095ffd45e65db5a7e993943: Status 404 returned error can't find the container with id bf305e5130bda27b493f87d2348ae9741f58f049e095ffd45e65db5a7e993943 Apr 16 16:25:09.516659 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.516622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:09.519777 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.519734 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/91db00c7-daa7-456d-93c4-bda81def2d2d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-nkz7c\" (UID: \"91db00c7-daa7-456d-93c4-bda81def2d2d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:09.679135 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.678906 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" Apr 16 16:25:09.826748 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:09.826712 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-nkz7c"] Apr 16 16:25:09.831552 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:09.831507 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91db00c7_daa7_456d_93c4_bda81def2d2d.slice/crio-3e41caee8b0e02505f774e72e69c2de455227b8bb0fe207e9e6bd047e10da56a WatchSource:0}: Error finding container 3e41caee8b0e02505f774e72e69c2de455227b8bb0fe207e9e6bd047e10da56a: Status 404 returned error can't find the container with id 3e41caee8b0e02505f774e72e69c2de455227b8bb0fe207e9e6bd047e10da56a Apr 16 16:25:10.087482 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.087390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567c4bcc7b-9h4hq" event={"ID":"c6bdb172-fc8d-4b50-b31d-7e254b8df08a","Type":"ContainerStarted","Data":"bf305e5130bda27b493f87d2348ae9741f58f049e095ffd45e65db5a7e993943"} Apr 16 16:25:10.089934 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.089731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" event={"ID":"91db00c7-daa7-456d-93c4-bda81def2d2d","Type":"ContainerStarted","Data":"3e41caee8b0e02505f774e72e69c2de455227b8bb0fe207e9e6bd047e10da56a"} Apr 16 16:25:10.102730 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.102703 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-ht495" Apr 16 16:25:10.627528 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.627491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vnx\" (UniqueName: 
\"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod \"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:25:10.627712 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.627547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:25:10.630414 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.630342 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:25:10.630636 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.630346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:25:10.641041 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.641016 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:25:10.642027 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.641985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0c36b6-3279-4629-991c-70026ff0d0b6-metrics-certs\") pod \"network-metrics-daemon-stfn4\" (UID: \"0b0c36b6-3279-4629-991c-70026ff0d0b6\") " pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:25:10.653464 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.653407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vnx\" (UniqueName: \"kubernetes.io/projected/eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e-kube-api-access-c9vnx\") pod 
\"network-check-target-n26xz\" (UID: \"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e\") " pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:25:10.867504 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.867482 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f6w5w\"" Apr 16 16:25:10.875901 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.875869 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stfn4" Apr 16 16:25:10.876563 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.876537 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w5tzs\"" Apr 16 16:25:10.882306 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:10.882282 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:25:13.297526 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:13.297383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n26xz"] Apr 16 16:25:13.301364 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:13.301333 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb0bd2c_94b2_4f4f_a32a_8624aa8a7d0e.slice/crio-d5e40875e1317c35fe4ede6f535d729dd26a54f360b4150153cbec6ae29aecf2 WatchSource:0}: Error finding container d5e40875e1317c35fe4ede6f535d729dd26a54f360b4150153cbec6ae29aecf2: Status 404 returned error can't find the container with id d5e40875e1317c35fe4ede6f535d729dd26a54f360b4150153cbec6ae29aecf2 Apr 16 16:25:13.324970 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:13.324912 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-stfn4"] Apr 16 16:25:13.329758 ip-10-0-128-64 
kubenswrapper[2578]: W0416 16:25:13.329728 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b0c36b6_3279_4629_991c_70026ff0d0b6.slice/crio-72a200dd0c98fe938b074bfeeef93fae00acbe3d8169011a46efa882fa1a8298 WatchSource:0}: Error finding container 72a200dd0c98fe938b074bfeeef93fae00acbe3d8169011a46efa882fa1a8298: Status 404 returned error can't find the container with id 72a200dd0c98fe938b074bfeeef93fae00acbe3d8169011a46efa882fa1a8298 Apr 16 16:25:14.106530 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.106475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n26xz" event={"ID":"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e","Type":"ContainerStarted","Data":"d5e40875e1317c35fe4ede6f535d729dd26a54f360b4150153cbec6ae29aecf2"} Apr 16 16:25:14.109815 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.109785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" event={"ID":"91db00c7-daa7-456d-93c4-bda81def2d2d","Type":"ContainerStarted","Data":"b0ca8013a37f8e82649613aac6ceadd3f5ec3aed237238fca1475418fcf91e51"} Apr 16 16:25:14.110023 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.109819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" event={"ID":"91db00c7-daa7-456d-93c4-bda81def2d2d","Type":"ContainerStarted","Data":"74abd190164b2f25198f3b442f486b2fc820f59b8c661864087d3a36ecb44899"} Apr 16 16:25:14.112102 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.112060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567c4bcc7b-9h4hq" event={"ID":"c6bdb172-fc8d-4b50-b31d-7e254b8df08a","Type":"ContainerStarted","Data":"326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052"} Apr 16 16:25:14.113680 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.113652 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-stfn4" event={"ID":"0b0c36b6-3279-4629-991c-70026ff0d0b6","Type":"ContainerStarted","Data":"72a200dd0c98fe938b074bfeeef93fae00acbe3d8169011a46efa882fa1a8298"} Apr 16 16:25:14.115839 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.115816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559f49f8-wf69c" event={"ID":"92c39430-d955-4843-8da1-226b6e806017","Type":"ContainerStarted","Data":"b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3"} Apr 16 16:25:14.137762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.137539 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-nkz7c" podStartSLOduration=3.8062100450000003 podStartE2EDuration="7.137522382s" podCreationTimestamp="2026-04-16 16:25:07 +0000 UTC" firstStartedPulling="2026-04-16 16:25:09.834141217 +0000 UTC m=+65.556685585" lastFinishedPulling="2026-04-16 16:25:13.165453544 +0000 UTC m=+68.887997922" observedRunningTime="2026-04-16 16:25:14.136597333 +0000 UTC m=+69.859141728" watchObservedRunningTime="2026-04-16 16:25:14.137522382 +0000 UTC m=+69.860066775" Apr 16 16:25:14.174582 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.174506 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-567c4bcc7b-9h4hq" podStartSLOduration=2.447204889 podStartE2EDuration="6.174489004s" podCreationTimestamp="2026-04-16 16:25:08 +0000 UTC" firstStartedPulling="2026-04-16 16:25:09.44769755 +0000 UTC m=+65.170241918" lastFinishedPulling="2026-04-16 16:25:13.174981651 +0000 UTC m=+68.897526033" observedRunningTime="2026-04-16 16:25:14.174441161 +0000 UTC m=+69.896985551" watchObservedRunningTime="2026-04-16 16:25:14.174489004 +0000 UTC m=+69.897033397" Apr 16 16:25:14.208384 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:14.207712 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54559f49f8-wf69c" podStartSLOduration=2.307696848 podStartE2EDuration="14.207696544s" podCreationTimestamp="2026-04-16 16:25:00 +0000 UTC" firstStartedPulling="2026-04-16 16:25:01.265452789 +0000 UTC m=+56.987997163" lastFinishedPulling="2026-04-16 16:25:13.165452475 +0000 UTC m=+68.887996859" observedRunningTime="2026-04-16 16:25:14.204268936 +0000 UTC m=+69.926813330" watchObservedRunningTime="2026-04-16 16:25:14.207696544 +0000 UTC m=+69.930240936" Apr 16 16:25:16.124811 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.124767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-stfn4" event={"ID":"0b0c36b6-3279-4629-991c-70026ff0d0b6","Type":"ContainerStarted","Data":"f4f3c8883dc982141bf4baef952036beff1705578e99ce032e53ad1ae7ce11d7"} Apr 16 16:25:16.295948 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.295650 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh"] Apr 16 16:25:16.324796 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.324759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh"] Apr 16 16:25:16.325597 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.324961 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.329117 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.329070 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:25:16.330973 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.330783 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 16:25:16.331087 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.331023 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-mkscb\"" Apr 16 16:25:16.372388 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.372254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c66366bc-9d81-43f5-af40-4e9c08a337d5-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.372388 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.372316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87ltb\" (UniqueName: \"kubernetes.io/projected/c66366bc-9d81-43f5-af40-4e9c08a337d5-kube-api-access-87ltb\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.372388 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.372386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c66366bc-9d81-43f5-af40-4e9c08a337d5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.372678 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.372450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c66366bc-9d81-43f5-af40-4e9c08a337d5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.394998 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.394902 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jdflm"] Apr 16 16:25:16.414972 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.414942 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.424331 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.423873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:25:16.424331 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.423890 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:25:16.424331 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.424173 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:25:16.424331 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.424302 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vdxn6\"" Apr 16 16:25:16.443774 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.443713 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-4rntv"] Apr 16 16:25:16.464300 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.464213 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.466804 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.466761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-4rntv"] Apr 16 16:25:16.469279 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.469257 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:25:16.469786 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.469767 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:25:16.470028 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.470006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:25:16.470122 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.470042 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-rg4ch\"" Apr 16 16:25:16.473010 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.472965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87ltb\" (UniqueName: \"kubernetes.io/projected/c66366bc-9d81-43f5-af40-4e9c08a337d5-kube-api-access-87ltb\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.473130 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-wtmp\") pod \"node-exporter-jdflm\" (UID: 
\"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473130 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c66366bc-9d81-43f5-af40-4e9c08a337d5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.473130 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflks\" (UniqueName: \"kubernetes.io/projected/41d52522-7c10-4ddf-9497-d3fd714c18b9-kube-api-access-jflks\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-sys\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c66366bc-9d81-43f5-af40-4e9c08a337d5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-root\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-accelerators-collector-config\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473290 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-tls\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c66366bc-9d81-43f5-af40-4e9c08a337d5-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: 
\"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473357 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-textfile\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.473441 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.473389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41d52522-7c10-4ddf-9497-d3fd714c18b9-metrics-client-ca\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.477510 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.477483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c66366bc-9d81-43f5-af40-4e9c08a337d5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.479634 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.479604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c66366bc-9d81-43f5-af40-4e9c08a337d5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.482105 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:25:16.482082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c66366bc-9d81-43f5-af40-4e9c08a337d5-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.511183 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.511131 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87ltb\" (UniqueName: \"kubernetes.io/projected/c66366bc-9d81-43f5-af40-4e9c08a337d5-kube-api-access-87ltb\") pod \"openshift-state-metrics-5669946b84-dzwlh\" (UID: \"c66366bc-9d81-43f5-af40-4e9c08a337d5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.574359 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-tls\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574379 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-textfile\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41d52522-7c10-4ddf-9497-d3fd714c18b9-metrics-client-ca\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6e8dbc9b-a73b-491e-802f-609a1250cd4b-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-wtmp\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e8dbc9b-a73b-491e-802f-609a1250cd4b-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574580 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jflks\" (UniqueName: \"kubernetes.io/projected/41d52522-7c10-4ddf-9497-d3fd714c18b9-kube-api-access-jflks\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574604 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2zf\" (UniqueName: \"kubernetes.io/projected/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-api-access-kf2zf\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.574676 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-sys\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.575323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: 
\"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.575323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.575323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-root\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.575323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.574790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-accelerators-collector-config\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.575323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.575086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-sys\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.575609 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.575487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-accelerators-collector-config\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.575609 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.575582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-root\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.575769 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.575743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-wtmp\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.576096 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.576052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-textfile\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.578684 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.578344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-tls\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.578684 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.578648 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41d52522-7c10-4ddf-9497-d3fd714c18b9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.582224 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.582200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41d52522-7c10-4ddf-9497-d3fd714c18b9-metrics-client-ca\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.606770 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.606709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflks\" (UniqueName: \"kubernetes.io/projected/41d52522-7c10-4ddf-9497-d3fd714c18b9-kube-api-access-jflks\") pod \"node-exporter-jdflm\" (UID: \"41d52522-7c10-4ddf-9497-d3fd714c18b9\") " pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.637598 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.637560 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" Apr 16 16:25:16.676138 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.676138 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.676353 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.676353 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6e8dbc9b-a73b-491e-802f-609a1250cd4b-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.676353 
ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e8dbc9b-a73b-491e-802f-609a1250cd4b-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.676353 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2zf\" (UniqueName: \"kubernetes.io/projected/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-api-access-kf2zf\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.676994 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.676883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.677415 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.677382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6e8dbc9b-a73b-491e-802f-609a1250cd4b-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.677492 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.677445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e8dbc9b-a73b-491e-802f-609a1250cd4b-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.679089 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.679062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.679187 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.679147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.686444 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.686413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2zf\" (UniqueName: \"kubernetes.io/projected/6e8dbc9b-a73b-491e-802f-609a1250cd4b-kube-api-access-kf2zf\") pod \"kube-state-metrics-7479c89684-4rntv\" (UID: \"6e8dbc9b-a73b-491e-802f-609a1250cd4b\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.727533 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.727495 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jdflm" Apr 16 16:25:16.737896 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:16.737852 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d52522_7c10_4ddf_9497_d3fd714c18b9.slice/crio-ca31f8662c7a2eb042cabe9f32c468e2848430bfb0e4799cc90ea376f4b2294f WatchSource:0}: Error finding container ca31f8662c7a2eb042cabe9f32c468e2848430bfb0e4799cc90ea376f4b2294f: Status 404 returned error can't find the container with id ca31f8662c7a2eb042cabe9f32c468e2848430bfb0e4799cc90ea376f4b2294f Apr 16 16:25:16.783591 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.783213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" Apr 16 16:25:16.888680 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.888656 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh"] Apr 16 16:25:16.961706 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:16.961631 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-4rntv"] Apr 16 16:25:17.020698 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:17.020661 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc66366bc_9d81_43f5_af40_4e9c08a337d5.slice/crio-20ed08266cf5171ec2f32825caaa854bd11d128f773ac3232b004c5db4659145 WatchSource:0}: Error finding container 20ed08266cf5171ec2f32825caaa854bd11d128f773ac3232b004c5db4659145: Status 404 returned error can't find the container with id 20ed08266cf5171ec2f32825caaa854bd11d128f773ac3232b004c5db4659145 Apr 16 16:25:17.021331 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:17.021304 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8dbc9b_a73b_491e_802f_609a1250cd4b.slice/crio-14f1937f24c91c00828e86b72d3766a6dcd4dfad649681db48be224220d9f45a WatchSource:0}: Error finding container 14f1937f24c91c00828e86b72d3766a6dcd4dfad649681db48be224220d9f45a: Status 404 returned error can't find the container with id 14f1937f24c91c00828e86b72d3766a6dcd4dfad649681db48be224220d9f45a Apr 16 16:25:17.128751 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:17.128718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jdflm" event={"ID":"41d52522-7c10-4ddf-9497-d3fd714c18b9","Type":"ContainerStarted","Data":"ca31f8662c7a2eb042cabe9f32c468e2848430bfb0e4799cc90ea376f4b2294f"} Apr 16 16:25:17.129890 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:17.129864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" event={"ID":"c66366bc-9d81-43f5-af40-4e9c08a337d5","Type":"ContainerStarted","Data":"20ed08266cf5171ec2f32825caaa854bd11d128f773ac3232b004c5db4659145"} Apr 16 16:25:17.131772 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:17.131746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-stfn4" event={"ID":"0b0c36b6-3279-4629-991c-70026ff0d0b6","Type":"ContainerStarted","Data":"d6bb60ba8a102bac7fb1c5963f73725747385c0e7e66fc38458c1e79c70617b1"} Apr 16 16:25:17.132978 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:17.132955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" event={"ID":"6e8dbc9b-a73b-491e-802f-609a1250cd4b","Type":"ContainerStarted","Data":"14f1937f24c91c00828e86b72d3766a6dcd4dfad649681db48be224220d9f45a"} Apr 16 16:25:17.153813 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:17.153764 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-stfn4" 
podStartSLOduration=70.197892335 podStartE2EDuration="1m12.153745409s" podCreationTimestamp="2026-04-16 16:24:05 +0000 UTC" firstStartedPulling="2026-04-16 16:25:13.332507165 +0000 UTC m=+69.055051534" lastFinishedPulling="2026-04-16 16:25:15.288360233 +0000 UTC m=+71.010904608" observedRunningTime="2026-04-16 16:25:17.151605636 +0000 UTC m=+72.874150026" watchObservedRunningTime="2026-04-16 16:25:17.153745409 +0000 UTC m=+72.876289802" Apr 16 16:25:18.138766 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:18.138732 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n26xz" event={"ID":"eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e","Type":"ContainerStarted","Data":"846b53943c5932d63c9a6f3c1d6b4263ad96eda35b955d6c5fe5df51ff453e46"} Apr 16 16:25:18.139223 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:18.139161 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:25:18.142141 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:18.142112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" event={"ID":"c66366bc-9d81-43f5-af40-4e9c08a337d5","Type":"ContainerStarted","Data":"65698a138252e60fdcf31b802c10da2e01c70c47ecce8b9c7102f174feed81ab"} Apr 16 16:25:18.142269 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:18.142149 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" event={"ID":"c66366bc-9d81-43f5-af40-4e9c08a337d5","Type":"ContainerStarted","Data":"04ceb3a3522e2dbc838ba4723b6a47943810ef6197719c75b77e877e5e6364a9"} Apr 16 16:25:18.170347 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:18.169125 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n26xz" podStartSLOduration=70.420710692 
podStartE2EDuration="1m14.169107883s" podCreationTimestamp="2026-04-16 16:24:04 +0000 UTC" firstStartedPulling="2026-04-16 16:25:13.30558521 +0000 UTC m=+69.028129596" lastFinishedPulling="2026-04-16 16:25:17.053982414 +0000 UTC m=+72.776526787" observedRunningTime="2026-04-16 16:25:18.167651542 +0000 UTC m=+73.890195933" watchObservedRunningTime="2026-04-16 16:25:18.169107883 +0000 UTC m=+73.891652286" Apr 16 16:25:19.146800 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:19.146765 2578 generic.go:358] "Generic (PLEG): container finished" podID="41d52522-7c10-4ddf-9497-d3fd714c18b9" containerID="add47dfa6a844807723155b1bb56fa4bf52da91a7f6422c681f5cf868e10bb7e" exitCode=0 Apr 16 16:25:19.147295 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:19.146857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jdflm" event={"ID":"41d52522-7c10-4ddf-9497-d3fd714c18b9","Type":"ContainerDied","Data":"add47dfa6a844807723155b1bb56fa4bf52da91a7f6422c681f5cf868e10bb7e"} Apr 16 16:25:19.298027 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:19.297993 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:19.298196 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:19.298037 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:19.303435 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:19.303407 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:20.152498 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.152406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" event={"ID":"6e8dbc9b-a73b-491e-802f-609a1250cd4b","Type":"ContainerStarted","Data":"0df25f2ba927c7e6c08caabae5c71d32118d2e0176b1dc180cdc09b9f5ed77c7"} Apr 16 
16:25:20.152498 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.152441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" event={"ID":"6e8dbc9b-a73b-491e-802f-609a1250cd4b","Type":"ContainerStarted","Data":"dae4f600c2cd4195096886de78e6d8830e5d3e03ce63ea099050bc1d1ad4ccd0"} Apr 16 16:25:20.152498 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.152454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" event={"ID":"6e8dbc9b-a73b-491e-802f-609a1250cd4b","Type":"ContainerStarted","Data":"67e9651e1409b3edf7dfa2d5bff4f70168fb12b66bb34e1678cf49b51d26324a"} Apr 16 16:25:20.154421 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.154399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jdflm" event={"ID":"41d52522-7c10-4ddf-9497-d3fd714c18b9","Type":"ContainerStarted","Data":"8e4f41994e19cda15f16a08340173c186f9469d180786ee82fd022d3296890bc"} Apr 16 16:25:20.154531 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.154425 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jdflm" event={"ID":"41d52522-7c10-4ddf-9497-d3fd714c18b9","Type":"ContainerStarted","Data":"4ab004e2d1784ce8af99c54695b36bc0755ed8f3bcbbdec6af645e653695f9d8"} Apr 16 16:25:20.156313 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.156291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" event={"ID":"c66366bc-9d81-43f5-af40-4e9c08a337d5","Type":"ContainerStarted","Data":"55c4b80045ddfd81635c2e0bec9d3f276492dce2d7c1c2bc92436a43372d4029"} Apr 16 16:25:20.160529 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.160508 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:25:20.175291 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:25:20.175244 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-4rntv" podStartSLOduration=1.671906511 podStartE2EDuration="4.175228504s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:17.042110092 +0000 UTC m=+72.764654470" lastFinishedPulling="2026-04-16 16:25:19.54543208 +0000 UTC m=+75.267976463" observedRunningTime="2026-04-16 16:25:20.175155788 +0000 UTC m=+75.897700179" watchObservedRunningTime="2026-04-16 16:25:20.175228504 +0000 UTC m=+75.897772896" Apr 16 16:25:20.195496 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.195452 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jdflm" podStartSLOduration=2.897398527 podStartE2EDuration="4.195438669s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:16.741073532 +0000 UTC m=+72.463617912" lastFinishedPulling="2026-04-16 16:25:18.039113674 +0000 UTC m=+73.761658054" observedRunningTime="2026-04-16 16:25:20.194797355 +0000 UTC m=+75.917341745" watchObservedRunningTime="2026-04-16 16:25:20.195438669 +0000 UTC m=+75.917983060" Apr 16 16:25:20.241035 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.240990 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-dzwlh" podStartSLOduration=1.981704765 podStartE2EDuration="4.240973512s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:17.282695756 +0000 UTC m=+73.005240138" lastFinishedPulling="2026-04-16 16:25:19.541964518 +0000 UTC m=+75.264508885" observedRunningTime="2026-04-16 16:25:20.240332782 +0000 UTC m=+75.962877169" watchObservedRunningTime="2026-04-16 16:25:20.240973512 +0000 UTC m=+75.963517901" Apr 16 16:25:20.263423 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:20.263394 2578 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["openshift-console/console-54559f49f8-wf69c"] Apr 16 16:25:21.119603 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:21.119567 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:22.611097 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.611062 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fc48c8d8c-gw2pk"] Apr 16 16:25:22.628529 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.628490 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:22.640504 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.640471 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fc48c8d8c-gw2pk"] Apr 16 16:25:22.668564 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.668526 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:25:22.688161 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.687651 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:25:22.692471 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.692445 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:25:22.692782 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.692758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:25:22.692885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.692800 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:25:22.693902 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.693657 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:25:22.693902 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.693740 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:25:22.693902 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.693763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:25:22.694196 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.694125 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:25:22.694196 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.694175 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:25:22.694598 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.694346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 16:25:22.694598 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.694355 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2dkwr\"" Apr 16 16:25:22.694598 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.694559 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:25:22.694814 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.694774 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:25:22.696292 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.696272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e6v0657epghga\"" Apr 16 16:25:22.699437 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.699420 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:25:22.699759 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.699740 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:25:22.702619 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.702598 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-serving-cert\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " 
pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-trusted-ca-bundle\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-oauth-serving-cert\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbbxs\" (UniqueName: \"kubernetes.io/projected/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-kube-api-access-rbbxs\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-oauth-config\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-service-ca\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.728747 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.728745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-config\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.829612 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829577 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-oauth-serving-cert\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.829612 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-config\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-web-config\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829695 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbbxs\" (UniqueName: \"kubernetes.io/projected/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-kube-api-access-rbbxs\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.829866 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97mf2\" (UniqueName: \"kubernetes.io/projected/884d6486-6a24-4277-850c-a3725856c08b-kube-api-access-97mf2\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-oauth-config\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.829980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-service-ca\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/884d6486-6a24-4277-850c-a3725856c08b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-config\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/884d6486-6a24-4277-850c-a3725856c08b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-oauth-serving-cert\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/884d6486-6a24-4277-850c-a3725856c08b-config-out\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830417 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-serving-cert\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830511 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.830964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-trusted-ca-bundle\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-service-ca\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.830964 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.830842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-config\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.831444 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.831427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-trusted-ca-bundle\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.844192 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.844161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-serving-cert\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.851526 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.851498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-oauth-config\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.854131 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.854103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbbxs\" (UniqueName: \"kubernetes.io/projected/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-kube-api-access-rbbxs\") pod \"console-7fc48c8d8c-gw2pk\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.931129 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-config\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931129 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931129 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-web-config\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931129 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97mf2\" (UniqueName: \"kubernetes.io/projected/884d6486-6a24-4277-850c-a3725856c08b-kube-api-access-97mf2\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/884d6486-6a24-4277-850c-a3725856c08b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/884d6486-6a24-4277-850c-a3725856c08b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931457 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/884d6486-6a24-4277-850c-a3725856c08b-config-out\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931915 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931915 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.931915 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.931530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.938420 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.938396 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:22.941010 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.940992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.951543 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.951475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.952885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.952857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/884d6486-6a24-4277-850c-a3725856c08b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.953850 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.953823 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.955141 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.955120 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.955903 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.955835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.955903 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.955835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.956277 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.956239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.958672 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.958541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/884d6486-6a24-4277-850c-a3725856c08b-config-out\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.958955 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.958887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.960063 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.959490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/884d6486-6a24-4277-850c-a3725856c08b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.960063 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.959520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97mf2\" (UniqueName: \"kubernetes.io/projected/884d6486-6a24-4277-850c-a3725856c08b-kube-api-access-97mf2\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.960063 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.959869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-config\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.960621 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.960563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-web-config\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.962873 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.962835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.963696 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.963653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.963962 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.963881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/884d6486-6a24-4277-850c-a3725856c08b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:22.964962 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:22.964873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/884d6486-6a24-4277-850c-a3725856c08b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"884d6486-6a24-4277-850c-a3725856c08b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:23.000995 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:23.000956 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:23.098199 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:23.098157 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fc48c8d8c-gw2pk"]
Apr 16 16:25:23.103250 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:23.103222 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716282fd_bbf8_41ad_87b6_eb3f5a556d4d.slice/crio-f61d606022d5b8f8cc929c4372cf536cdf788f334fdc764f6f30de261aea37f4 WatchSource:0}: Error finding container f61d606022d5b8f8cc929c4372cf536cdf788f334fdc764f6f30de261aea37f4: Status 404 returned error can't find the container with id f61d606022d5b8f8cc929c4372cf536cdf788f334fdc764f6f30de261aea37f4
Apr 16 16:25:23.157560 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:23.157505 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:25:23.161124 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:25:23.161097 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884d6486_6a24_4277_850c_a3725856c08b.slice/crio-0f72ff043f8edd61211aa557fc0412b4485ad05a93368cb93fdf751a56e94a9c WatchSource:0}: Error finding container 0f72ff043f8edd61211aa557fc0412b4485ad05a93368cb93fdf751a56e94a9c: Status 404 returned error can't find the container with id 0f72ff043f8edd61211aa557fc0412b4485ad05a93368cb93fdf751a56e94a9c
Apr 16 16:25:23.165755 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:23.165727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fc48c8d8c-gw2pk" event={"ID":"716282fd-bbf8-41ad-87b6-eb3f5a556d4d","Type":"ContainerStarted","Data":"f61d606022d5b8f8cc929c4372cf536cdf788f334fdc764f6f30de261aea37f4"}
Apr 16 16:25:23.166851 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:23.166828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"0f72ff043f8edd61211aa557fc0412b4485ad05a93368cb93fdf751a56e94a9c"}
Apr 16 16:25:24.173310 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:24.173168 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fc48c8d8c-gw2pk" event={"ID":"716282fd-bbf8-41ad-87b6-eb3f5a556d4d","Type":"ContainerStarted","Data":"06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768"}
Apr 16 16:25:24.202583 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:24.202506 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fc48c8d8c-gw2pk" podStartSLOduration=2.202486313 podStartE2EDuration="2.202486313s" podCreationTimestamp="2026-04-16 16:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:24.201317497 +0000 UTC m=+79.923861887" watchObservedRunningTime="2026-04-16 16:25:24.202486313 +0000 UTC m=+79.925030701"
Apr 16 16:25:25.179556 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:25.179524 2578 generic.go:358] "Generic (PLEG): container finished" podID="884d6486-6a24-4277-850c-a3725856c08b" containerID="ed12bf6f7c163865fdb305463808b8519bb424e21011a8ff0ebdbd7f7a20d463" exitCode=0
Apr 16 16:25:25.179996 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:25.179614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerDied","Data":"ed12bf6f7c163865fdb305463808b8519bb424e21011a8ff0ebdbd7f7a20d463"}
Apr 16 16:25:25.950685 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:25.950555 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7fc48c8d8c-gw2pk"]
Apr 16 16:25:29.192909 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:29.192823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"f38891b873920f91b152886bd584b14dfae290c048f5eab825cee442c7d6754a"}
Apr 16 16:25:29.192909 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:29.192864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"72313cab474afc8ee587fec00b9789d436e665c74158b7abbf1611bfe7016de8"}
Apr 16 16:25:31.202074 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:31.202041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"fdfbd86b3b1c6131dfb96c5eaf8868cdeb79ecffb8c574903c2594d27f18a37b"}
Apr 16 16:25:32.208056 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:32.208025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"1703969e3c3782d77696ac265ecad033d51501dc3ba0bc418aae113ae9b1b0ed"}
Apr 16 16:25:32.208056 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:32.208060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"5db8fdcf3561e863403e141d41dcf2b064153f72440b938adfe727b1a85799cb"}
Apr 16 16:25:32.208483 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:32.208072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"884d6486-6a24-4277-850c-a3725856c08b","Type":"ContainerStarted","Data":"ba53e3602032ae43146c399d1ee819ab71496a605f3c90e07c63be13024502f9"}
Apr 16 16:25:32.262781 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:32.262718 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.421802616 podStartE2EDuration="10.262698577s" podCreationTimestamp="2026-04-16 16:25:22 +0000 UTC" firstStartedPulling="2026-04-16 16:25:23.163050034 +0000 UTC m=+78.885594406" lastFinishedPulling="2026-04-16 16:25:31.003945987 +0000 UTC m=+86.726490367" observedRunningTime="2026-04-16 16:25:32.262577882 +0000 UTC m=+87.985122273" watchObservedRunningTime="2026-04-16 16:25:32.262698577 +0000 UTC m=+87.985242964"
Apr 16 16:25:32.939486 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:32.939449 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7fc48c8d8c-gw2pk"
Apr 16 16:25:33.001880 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:33.001843 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:25:36.466830 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:36.466793 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-567c4bcc7b-9h4hq"]
Apr 16 16:25:45.282209 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.282153 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54559f49f8-wf69c" podUID="92c39430-d955-4843-8da1-226b6e806017" containerName="console" containerID="cri-o://b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3" gracePeriod=15
Apr 16 16:25:45.604082 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.604060 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54559f49f8-wf69c_92c39430-d955-4843-8da1-226b6e806017/console/0.log"
Apr 16 16:25:45.604202 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.604118 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54559f49f8-wf69c"
Apr 16 16:25:45.641670 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.641645 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-service-ca\") pod \"92c39430-d955-4843-8da1-226b6e806017\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") "
Apr 16 16:25:45.641791 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.641676 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75nc9\" (UniqueName: \"kubernetes.io/projected/92c39430-d955-4843-8da1-226b6e806017-kube-api-access-75nc9\") pod \"92c39430-d955-4843-8da1-226b6e806017\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") "
Apr 16 16:25:45.641791 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.641717 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-serving-cert\") pod \"92c39430-d955-4843-8da1-226b6e806017\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") "
Apr 16 16:25:45.641916 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.641893 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-console-config\") pod \"92c39430-d955-4843-8da1-226b6e806017\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") "
Apr 16 16:25:45.641999 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.641982 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-oauth-config\") pod \"92c39430-d955-4843-8da1-226b6e806017\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") "
Apr 16 16:25:45.642094 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.642060 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-oauth-serving-cert\") pod \"92c39430-d955-4843-8da1-226b6e806017\" (UID: \"92c39430-d955-4843-8da1-226b6e806017\") "
Apr 16 16:25:45.642219 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.642102 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-service-ca" (OuterVolumeSpecName: "service-ca") pod "92c39430-d955-4843-8da1-226b6e806017" (UID: "92c39430-d955-4843-8da1-226b6e806017"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:25:45.642309 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.642276 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-console-config" (OuterVolumeSpecName: "console-config") pod "92c39430-d955-4843-8da1-226b6e806017" (UID: "92c39430-d955-4843-8da1-226b6e806017"). InnerVolumeSpecName "console-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:45.642492 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.642461 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-service-ca\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:45.642492 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.642488 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-console-config\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:45.642631 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.642527 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "92c39430-d955-4843-8da1-226b6e806017" (UID: "92c39430-d955-4843-8da1-226b6e806017"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:45.644083 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.644052 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "92c39430-d955-4843-8da1-226b6e806017" (UID: "92c39430-d955-4843-8da1-226b6e806017"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:45.644195 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.644079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c39430-d955-4843-8da1-226b6e806017-kube-api-access-75nc9" (OuterVolumeSpecName: "kube-api-access-75nc9") pod "92c39430-d955-4843-8da1-226b6e806017" (UID: "92c39430-d955-4843-8da1-226b6e806017"). 
InnerVolumeSpecName "kube-api-access-75nc9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:45.644195 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.644158 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "92c39430-d955-4843-8da1-226b6e806017" (UID: "92c39430-d955-4843-8da1-226b6e806017"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:45.743382 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.743349 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-oauth-config\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:45.743382 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.743377 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92c39430-d955-4843-8da1-226b6e806017-oauth-serving-cert\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:45.743382 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.743388 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75nc9\" (UniqueName: \"kubernetes.io/projected/92c39430-d955-4843-8da1-226b6e806017-kube-api-access-75nc9\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:45.743600 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:45.743397 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92c39430-d955-4843-8da1-226b6e806017-console-serving-cert\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:46.249626 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.249600 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-54559f49f8-wf69c_92c39430-d955-4843-8da1-226b6e806017/console/0.log" Apr 16 16:25:46.249806 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.249640 2578 generic.go:358] "Generic (PLEG): container finished" podID="92c39430-d955-4843-8da1-226b6e806017" containerID="b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3" exitCode=2 Apr 16 16:25:46.249806 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.249701 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54559f49f8-wf69c" Apr 16 16:25:46.249806 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.249728 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559f49f8-wf69c" event={"ID":"92c39430-d955-4843-8da1-226b6e806017","Type":"ContainerDied","Data":"b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3"} Apr 16 16:25:46.249806 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.249773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559f49f8-wf69c" event={"ID":"92c39430-d955-4843-8da1-226b6e806017","Type":"ContainerDied","Data":"9525d3954937108178beea9953435a3db320dc1995c86ec2b0766c45a37cbf3b"} Apr 16 16:25:46.249806 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.249794 2578 scope.go:117] "RemoveContainer" containerID="b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3" Apr 16 16:25:46.257745 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.257729 2578 scope.go:117] "RemoveContainer" containerID="b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3" Apr 16 16:25:46.258041 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:25:46.258023 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3\": container with ID starting with 
b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3 not found: ID does not exist" containerID="b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3" Apr 16 16:25:46.258100 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.258049 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3"} err="failed to get container status \"b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3\": rpc error: code = NotFound desc = could not find container \"b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3\": container with ID starting with b16adef4bb08492c8fe3e20596cf72b93287b610cb063c725ba1a14a015662b3 not found: ID does not exist" Apr 16 16:25:46.271879 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.271858 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54559f49f8-wf69c"] Apr 16 16:25:46.279075 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.277657 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54559f49f8-wf69c"] Apr 16 16:25:46.854226 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:46.854196 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c39430-d955-4843-8da1-226b6e806017" path="/var/lib/kubelet/pods/92c39430-d955-4843-8da1-226b6e806017/volumes" Apr 16 16:25:49.149557 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:49.149530 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n26xz" Apr 16 16:25:51.208154 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.208118 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7fc48c8d8c-gw2pk" podUID="716282fd-bbf8-41ad-87b6-eb3f5a556d4d" containerName="console" 
containerID="cri-o://06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768" gracePeriod=15 Apr 16 16:25:51.475534 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.475509 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fc48c8d8c-gw2pk_716282fd-bbf8-41ad-87b6-eb3f5a556d4d/console/0.log" Apr 16 16:25:51.475677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.475581 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:51.589770 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.589736 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-serving-cert\") pod \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.589770 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.589775 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-service-ca\") pod \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.590037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.589804 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbbxs\" (UniqueName: \"kubernetes.io/projected/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-kube-api-access-rbbxs\") pod \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.590037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.589824 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-oauth-config\") pod 
\"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.590037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.589962 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-config\") pod \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.590037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.590004 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-oauth-serving-cert\") pod \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.590992 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.590442 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-trusted-ca-bundle\") pod \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\" (UID: \"716282fd-bbf8-41ad-87b6-eb3f5a556d4d\") " Apr 16 16:25:51.590992 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.590659 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-config" (OuterVolumeSpecName: "console-config") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:51.590992 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.590686 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-service-ca" (OuterVolumeSpecName: "service-ca") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:51.590992 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.590693 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:51.591290 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.591078 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:51.591290 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.591270 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-service-ca\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:51.591290 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.591289 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-config\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:51.591455 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.591304 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-oauth-serving-cert\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:51.592781 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.592750 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:51.594976 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.593033 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:51.596740 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.596712 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-kube-api-access-rbbxs" (OuterVolumeSpecName: "kube-api-access-rbbxs") pod "716282fd-bbf8-41ad-87b6-eb3f5a556d4d" (UID: "716282fd-bbf8-41ad-87b6-eb3f5a556d4d"). InnerVolumeSpecName "kube-api-access-rbbxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:51.692187 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.692154 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbbxs\" (UniqueName: \"kubernetes.io/projected/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-kube-api-access-rbbxs\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:51.692187 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.692181 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-oauth-config\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:51.692187 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.692191 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-trusted-ca-bundle\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:51.692401 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:51.692201 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/716282fd-bbf8-41ad-87b6-eb3f5a556d4d-console-serving-cert\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:25:52.268076 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.268049 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7fc48c8d8c-gw2pk_716282fd-bbf8-41ad-87b6-eb3f5a556d4d/console/0.log" Apr 16 16:25:52.268479 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.268085 2578 generic.go:358] "Generic (PLEG): container finished" podID="716282fd-bbf8-41ad-87b6-eb3f5a556d4d" containerID="06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768" exitCode=2 Apr 16 16:25:52.268479 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.268123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fc48c8d8c-gw2pk" event={"ID":"716282fd-bbf8-41ad-87b6-eb3f5a556d4d","Type":"ContainerDied","Data":"06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768"} Apr 16 16:25:52.268479 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.268148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fc48c8d8c-gw2pk" event={"ID":"716282fd-bbf8-41ad-87b6-eb3f5a556d4d","Type":"ContainerDied","Data":"f61d606022d5b8f8cc929c4372cf536cdf788f334fdc764f6f30de261aea37f4"} Apr 16 16:25:52.268479 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.268162 2578 scope.go:117] "RemoveContainer" containerID="06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768" Apr 16 16:25:52.268479 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.268180 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fc48c8d8c-gw2pk" Apr 16 16:25:52.276533 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.276516 2578 scope.go:117] "RemoveContainer" containerID="06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768" Apr 16 16:25:52.276813 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:25:52.276777 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768\": container with ID starting with 06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768 not found: ID does not exist" containerID="06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768" Apr 16 16:25:52.276863 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.276823 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768"} err="failed to get container status \"06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768\": rpc error: code = NotFound desc = could not find container \"06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768\": container with ID starting with 06238d546e4a3e57e503775ce57d04a4a153f198f46bc425b96d129e1ce90768 not found: ID does not exist" Apr 16 16:25:52.295000 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.294975 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7fc48c8d8c-gw2pk"] Apr 16 16:25:52.305460 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.305438 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7fc48c8d8c-gw2pk"] Apr 16 16:25:52.854222 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:25:52.854191 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716282fd-bbf8-41ad-87b6-eb3f5a556d4d" 
path="/var/lib/kubelet/pods/716282fd-bbf8-41ad-87b6-eb3f5a556d4d/volumes" Apr 16 16:26:01.486646 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.486580 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-567c4bcc7b-9h4hq" podUID="c6bdb172-fc8d-4b50-b31d-7e254b8df08a" containerName="console" containerID="cri-o://326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052" gracePeriod=15 Apr 16 16:26:01.754842 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.754815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567c4bcc7b-9h4hq_c6bdb172-fc8d-4b50-b31d-7e254b8df08a/console/0.log" Apr 16 16:26:01.754989 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.754881 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:26:01.868438 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868402 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-oauth-serving-cert\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.868615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868450 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-oauth-config\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.868615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868474 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72658\" (UniqueName: \"kubernetes.io/projected/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-kube-api-access-72658\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" 
(UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.868615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868499 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-config\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.868615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868551 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-service-ca\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.868615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868576 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-trusted-ca-bundle\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.868615 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868608 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-serving-cert\") pod \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\" (UID: \"c6bdb172-fc8d-4b50-b31d-7e254b8df08a\") " Apr 16 16:26:01.869026 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.868991 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:01.869026 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.869017 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-config" (OuterVolumeSpecName: "console-config") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:01.869204 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.869124 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-service-ca" (OuterVolumeSpecName: "service-ca") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:01.869204 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.869122 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:01.870600 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.870569 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-kube-api-access-72658" (OuterVolumeSpecName: "kube-api-access-72658") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "kube-api-access-72658". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:26:01.870694 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.870678 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:01.870736 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.870698 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c6bdb172-fc8d-4b50-b31d-7e254b8df08a" (UID: "c6bdb172-fc8d-4b50-b31d-7e254b8df08a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:01.970086 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.970043 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-config\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:01.970086 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.970081 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-service-ca\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:01.970624 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.970092 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-trusted-ca-bundle\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:01.970760 ip-10-0-128-64 kubenswrapper[2578]: I0416 
16:26:01.970636 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-serving-cert\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:01.970760 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.970672 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-oauth-serving-cert\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:01.970760 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.970688 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-console-oauth-config\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:01.970760 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:01.970705 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72658\" (UniqueName: \"kubernetes.io/projected/c6bdb172-fc8d-4b50-b31d-7e254b8df08a-kube-api-access-72658\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:26:02.297908 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.297880 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567c4bcc7b-9h4hq_c6bdb172-fc8d-4b50-b31d-7e254b8df08a/console/0.log" Apr 16 16:26:02.298098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.297934 2578 generic.go:358] "Generic (PLEG): container finished" podID="c6bdb172-fc8d-4b50-b31d-7e254b8df08a" containerID="326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052" exitCode=2 Apr 16 16:26:02.298098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.298002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567c4bcc7b-9h4hq" 
event={"ID":"c6bdb172-fc8d-4b50-b31d-7e254b8df08a","Type":"ContainerDied","Data":"326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052"} Apr 16 16:26:02.298098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.298035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567c4bcc7b-9h4hq" event={"ID":"c6bdb172-fc8d-4b50-b31d-7e254b8df08a","Type":"ContainerDied","Data":"bf305e5130bda27b493f87d2348ae9741f58f049e095ffd45e65db5a7e993943"} Apr 16 16:26:02.298098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.298054 2578 scope.go:117] "RemoveContainer" containerID="326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052" Apr 16 16:26:02.298098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.298055 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-567c4bcc7b-9h4hq" Apr 16 16:26:02.307386 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.307314 2578 scope.go:117] "RemoveContainer" containerID="326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052" Apr 16 16:26:02.307660 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:26:02.307641 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052\": container with ID starting with 326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052 not found: ID does not exist" containerID="326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052" Apr 16 16:26:02.307725 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.307668 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052"} err="failed to get container status \"326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052\": rpc error: code = NotFound desc = could not find container 
\"326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052\": container with ID starting with 326df460321e86cc87adfdd626b535236faf1c7189981d93219792ce00023052 not found: ID does not exist" Apr 16 16:26:02.323717 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.323690 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-567c4bcc7b-9h4hq"] Apr 16 16:26:02.330660 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.330636 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-567c4bcc7b-9h4hq"] Apr 16 16:26:02.853274 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:02.853240 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bdb172-fc8d-4b50-b31d-7e254b8df08a" path="/var/lib/kubelet/pods/c6bdb172-fc8d-4b50-b31d-7e254b8df08a/volumes" Apr 16 16:26:23.001968 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:23.001936 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:23.020144 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:23.020115 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:26:23.372048 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:26:23.372018 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:27:05.280627 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280596 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cb8q7"] Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280845 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="716282fd-bbf8-41ad-87b6-eb3f5a556d4d" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280856 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="716282fd-bbf8-41ad-87b6-eb3f5a556d4d" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280869 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c39430-d955-4843-8da1-226b6e806017" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280875 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c39430-d955-4843-8da1-226b6e806017" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280889 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bdb172-fc8d-4b50-b31d-7e254b8df08a" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280894 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bdb172-fc8d-4b50-b31d-7e254b8df08a" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280946 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="716282fd-bbf8-41ad-87b6-eb3f5a556d4d" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280956 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6bdb172-fc8d-4b50-b31d-7e254b8df08a" containerName="console" Apr 16 16:27:05.281105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.280962 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="92c39430-d955-4843-8da1-226b6e806017" containerName="console" Apr 16 16:27:05.283780 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.283762 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.286143 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.286123 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:27:05.292023 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.291997 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cb8q7"] Apr 16 16:27:05.358070 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.358032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5a24ad4-a379-47f2-bc15-10eccd9d9898-original-pull-secret\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.358070 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.358070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5a24ad4-a379-47f2-bc15-10eccd9d9898-kubelet-config\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.358273 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.358103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5a24ad4-a379-47f2-bc15-10eccd9d9898-dbus\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.458753 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.458714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/f5a24ad4-a379-47f2-bc15-10eccd9d9898-original-pull-secret\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.458753 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.458754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5a24ad4-a379-47f2-bc15-10eccd9d9898-kubelet-config\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.458970 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.458780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5a24ad4-a379-47f2-bc15-10eccd9d9898-dbus\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.458970 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.458951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5a24ad4-a379-47f2-bc15-10eccd9d9898-kubelet-config\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.458970 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.458967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5a24ad4-a379-47f2-bc15-10eccd9d9898-dbus\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.461010 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.460995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5a24ad4-a379-47f2-bc15-10eccd9d9898-original-pull-secret\") pod \"global-pull-secret-syncer-cb8q7\" (UID: \"f5a24ad4-a379-47f2-bc15-10eccd9d9898\") " pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.593115 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.593031 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cb8q7" Apr 16 16:27:05.706460 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:05.706423 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cb8q7"] Apr 16 16:27:05.709252 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:27:05.709221 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a24ad4_a379_47f2_bc15_10eccd9d9898.slice/crio-ea19623fdbb22663c3660d18a92b93a51be42890c2efe3e73fa4b94f12a79ea0 WatchSource:0}: Error finding container ea19623fdbb22663c3660d18a92b93a51be42890c2efe3e73fa4b94f12a79ea0: Status 404 returned error can't find the container with id ea19623fdbb22663c3660d18a92b93a51be42890c2efe3e73fa4b94f12a79ea0 Apr 16 16:27:06.470351 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:06.470311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cb8q7" event={"ID":"f5a24ad4-a379-47f2-bc15-10eccd9d9898","Type":"ContainerStarted","Data":"ea19623fdbb22663c3660d18a92b93a51be42890c2efe3e73fa4b94f12a79ea0"} Apr 16 16:27:10.483286 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:10.483195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cb8q7" event={"ID":"f5a24ad4-a379-47f2-bc15-10eccd9d9898","Type":"ContainerStarted","Data":"f4b4b68b87f8e07e527c8d997dd361d43cef49dd95bbc574cccb7e4019652741"} Apr 16 16:27:10.500162 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:27:10.500111 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cb8q7" podStartSLOduration=1.059430114 podStartE2EDuration="5.500095534s" podCreationTimestamp="2026-04-16 16:27:05 +0000 UTC" firstStartedPulling="2026-04-16 16:27:05.710822489 +0000 UTC m=+181.433366856" lastFinishedPulling="2026-04-16 16:27:10.151487903 +0000 UTC m=+185.874032276" observedRunningTime="2026-04-16 16:27:10.499441058 +0000 UTC m=+186.221985450" watchObservedRunningTime="2026-04-16 16:27:10.500095534 +0000 UTC m=+186.222639923" Apr 16 16:29:04.740364 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:29:04.740329 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:29:04.740913 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:29:04.740417 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:29:04.743253 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:29:04.743232 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:34:04.763534 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:34:04.763507 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:34:04.764913 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:34:04.764893 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:39:04.779850 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:39:04.779821 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:39:04.782161 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:39:04.782140 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:40:11.000419 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.000387 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-vt4x6"] Apr 16 16:40:11.003371 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.003355 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.007543 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.007523 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:40:11.007668 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.007592 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:40:11.007724 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.007663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlkgs\" (UniqueName: \"kubernetes.io/projected/44fe971d-bc11-4b96-b457-a2b8b58c0639-kube-api-access-zlkgs\") pod \"seaweedfs-86cc847c5c-vt4x6\" (UID: \"44fe971d-bc11-4b96-b457-a2b8b58c0639\") " pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.007778 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.007766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/44fe971d-bc11-4b96-b457-a2b8b58c0639-data\") pod \"seaweedfs-86cc847c5c-vt4x6\" (UID: \"44fe971d-bc11-4b96-b457-a2b8b58c0639\") " pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.007833 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.007787 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kserve\"/\"default-dockercfg-lgzms\"" Apr 16 16:40:11.009753 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.009734 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:40:11.014208 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.014178 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-vt4x6"] Apr 16 16:40:11.108459 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.108425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/44fe971d-bc11-4b96-b457-a2b8b58c0639-data\") pod \"seaweedfs-86cc847c5c-vt4x6\" (UID: \"44fe971d-bc11-4b96-b457-a2b8b58c0639\") " pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.108620 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.108470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlkgs\" (UniqueName: \"kubernetes.io/projected/44fe971d-bc11-4b96-b457-a2b8b58c0639-kube-api-access-zlkgs\") pod \"seaweedfs-86cc847c5c-vt4x6\" (UID: \"44fe971d-bc11-4b96-b457-a2b8b58c0639\") " pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.108797 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.108777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/44fe971d-bc11-4b96-b457-a2b8b58c0639-data\") pod \"seaweedfs-86cc847c5c-vt4x6\" (UID: \"44fe971d-bc11-4b96-b457-a2b8b58c0639\") " pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.117751 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.117728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlkgs\" (UniqueName: \"kubernetes.io/projected/44fe971d-bc11-4b96-b457-a2b8b58c0639-kube-api-access-zlkgs\") pod \"seaweedfs-86cc847c5c-vt4x6\" (UID: \"44fe971d-bc11-4b96-b457-a2b8b58c0639\") " 
pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.313438 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.313406 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:11.432567 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.432433 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-vt4x6"] Apr 16 16:40:11.435281 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:40:11.435255 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44fe971d_bc11_4b96_b457_a2b8b58c0639.slice/crio-7e6497aede0439c2068885b8e225789f0de58c8c5c1fe93db947a3eaaf51849b WatchSource:0}: Error finding container 7e6497aede0439c2068885b8e225789f0de58c8c5c1fe93db947a3eaaf51849b: Status 404 returned error can't find the container with id 7e6497aede0439c2068885b8e225789f0de58c8c5c1fe93db947a3eaaf51849b Apr 16 16:40:11.436804 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.436789 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:40:11.544141 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:11.544107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-vt4x6" event={"ID":"44fe971d-bc11-4b96-b457-a2b8b58c0639","Type":"ContainerStarted","Data":"7e6497aede0439c2068885b8e225789f0de58c8c5c1fe93db947a3eaaf51849b"} Apr 16 16:40:14.553936 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:14.553888 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-vt4x6" event={"ID":"44fe971d-bc11-4b96-b457-a2b8b58c0639","Type":"ContainerStarted","Data":"f961e1fc6622cb90b3827d67e75d0f9bf99cde7ee22b49d65abbfb248dd5ffde"} Apr 16 16:40:14.554315 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:14.554004 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:40:14.572767 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:14.572717 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-vt4x6" podStartSLOduration=2.077153629 podStartE2EDuration="4.572703073s" podCreationTimestamp="2026-04-16 16:40:10 +0000 UTC" firstStartedPulling="2026-04-16 16:40:11.43695485 +0000 UTC m=+967.159499217" lastFinishedPulling="2026-04-16 16:40:13.932504293 +0000 UTC m=+969.655048661" observedRunningTime="2026-04-16 16:40:14.571641398 +0000 UTC m=+970.294185787" watchObservedRunningTime="2026-04-16 16:40:14.572703073 +0000 UTC m=+970.295247464" Apr 16 16:40:20.559112 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:40:20.559085 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-vt4x6" Apr 16 16:41:20.845888 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.845853 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-jpt94"] Apr 16 16:41:20.848147 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.848126 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:20.850695 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.850668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4125926-3da6-4244-9477-f06e200b8975-tls-certs\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:20.850793 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.850773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g56qc\" (UniqueName: \"kubernetes.io/projected/b4125926-3da6-4244-9477-f06e200b8975-kube-api-access-g56qc\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:20.851225 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.851209 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 16:41:20.851322 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.851242 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vg6hb\"" Apr 16 16:41:20.859085 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.859063 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-jpt94"] Apr 16 16:41:20.866837 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.866814 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-kx9rj"] Apr 16 16:41:20.868763 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.868748 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:20.871276 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.871258 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-sl4kn\"" Apr 16 16:41:20.871276 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.871279 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 16:41:20.879986 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.879966 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kx9rj"] Apr 16 16:41:20.951486 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.951437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g56qc\" (UniqueName: \"kubernetes.io/projected/b4125926-3da6-4244-9477-f06e200b8975-kube-api-access-g56qc\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:20.951670 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.951499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4125926-3da6-4244-9477-f06e200b8975-tls-certs\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:20.951670 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:41:20.951600 2578 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 16:41:20.951670 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:41:20.951645 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4125926-3da6-4244-9477-f06e200b8975-tls-certs podName:b4125926-3da6-4244-9477-f06e200b8975 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:41:21.451631484 +0000 UTC m=+1037.174175851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/b4125926-3da6-4244-9477-f06e200b8975-tls-certs") pod "model-serving-api-86f7b4b499-jpt94" (UID: "b4125926-3da6-4244-9477-f06e200b8975") : secret "model-serving-api-tls" not found Apr 16 16:41:20.960054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:20.960030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g56qc\" (UniqueName: \"kubernetes.io/projected/b4125926-3da6-4244-9477-f06e200b8975-kube-api-access-g56qc\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:21.052340 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.052311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bda92557-f9bd-421c-9ea3-ee47a8c11b2f-cert\") pod \"odh-model-controller-696fc77849-kx9rj\" (UID: \"bda92557-f9bd-421c-9ea3-ee47a8c11b2f\") " pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.052509 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.052391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr84f\" (UniqueName: \"kubernetes.io/projected/bda92557-f9bd-421c-9ea3-ee47a8c11b2f-kube-api-access-wr84f\") pod \"odh-model-controller-696fc77849-kx9rj\" (UID: \"bda92557-f9bd-421c-9ea3-ee47a8c11b2f\") " pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.153134 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.153049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr84f\" (UniqueName: \"kubernetes.io/projected/bda92557-f9bd-421c-9ea3-ee47a8c11b2f-kube-api-access-wr84f\") pod 
\"odh-model-controller-696fc77849-kx9rj\" (UID: \"bda92557-f9bd-421c-9ea3-ee47a8c11b2f\") " pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.153288 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.153213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bda92557-f9bd-421c-9ea3-ee47a8c11b2f-cert\") pod \"odh-model-controller-696fc77849-kx9rj\" (UID: \"bda92557-f9bd-421c-9ea3-ee47a8c11b2f\") " pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.155682 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.155660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bda92557-f9bd-421c-9ea3-ee47a8c11b2f-cert\") pod \"odh-model-controller-696fc77849-kx9rj\" (UID: \"bda92557-f9bd-421c-9ea3-ee47a8c11b2f\") " pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.164254 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.164230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr84f\" (UniqueName: \"kubernetes.io/projected/bda92557-f9bd-421c-9ea3-ee47a8c11b2f-kube-api-access-wr84f\") pod \"odh-model-controller-696fc77849-kx9rj\" (UID: \"bda92557-f9bd-421c-9ea3-ee47a8c11b2f\") " pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.179053 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.179028 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kx9rj" Apr 16 16:41:21.295450 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.293891 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kx9rj"] Apr 16 16:41:21.301458 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:41:21.301428 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda92557_f9bd_421c_9ea3_ee47a8c11b2f.slice/crio-9df12d5dad50cda99c3676d148ac759a5f3214001c969453a088f3c8a2e75586 WatchSource:0}: Error finding container 9df12d5dad50cda99c3676d148ac759a5f3214001c969453a088f3c8a2e75586: Status 404 returned error can't find the container with id 9df12d5dad50cda99c3676d148ac759a5f3214001c969453a088f3c8a2e75586 Apr 16 16:41:21.456105 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.456016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4125926-3da6-4244-9477-f06e200b8975-tls-certs\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:21.458333 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.458314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4125926-3da6-4244-9477-f06e200b8975-tls-certs\") pod \"model-serving-api-86f7b4b499-jpt94\" (UID: \"b4125926-3da6-4244-9477-f06e200b8975\") " pod="kserve/model-serving-api-86f7b4b499-jpt94" Apr 16 16:41:21.461202 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.461183 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-jpt94"
Apr 16 16:41:21.578167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.578079 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-jpt94"]
Apr 16 16:41:21.580581 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:41:21.580542 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4125926_3da6_4244_9477_f06e200b8975.slice/crio-05d602f4dcfae895fa137757f9f8bb65cb87a1fb2b3bd89a64dcd34348e24e29 WatchSource:0}: Error finding container 05d602f4dcfae895fa137757f9f8bb65cb87a1fb2b3bd89a64dcd34348e24e29: Status 404 returned error can't find the container with id 05d602f4dcfae895fa137757f9f8bb65cb87a1fb2b3bd89a64dcd34348e24e29
Apr 16 16:41:21.730654 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.730568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-jpt94" event={"ID":"b4125926-3da6-4244-9477-f06e200b8975","Type":"ContainerStarted","Data":"05d602f4dcfae895fa137757f9f8bb65cb87a1fb2b3bd89a64dcd34348e24e29"}
Apr 16 16:41:21.731499 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:21.731480 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kx9rj" event={"ID":"bda92557-f9bd-421c-9ea3-ee47a8c11b2f","Type":"ContainerStarted","Data":"9df12d5dad50cda99c3676d148ac759a5f3214001c969453a088f3c8a2e75586"}
Apr 16 16:41:25.749308 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:25.749269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-jpt94" event={"ID":"b4125926-3da6-4244-9477-f06e200b8975","Type":"ContainerStarted","Data":"ab1faef26883ce37b8ff05ba590c9692be5ad7ab34e5109e07a0e2cb5d1d362a"}
Apr 16 16:41:25.749756 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:25.749502 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-jpt94"
Apr 16 16:41:25.750651 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:25.750630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kx9rj" event={"ID":"bda92557-f9bd-421c-9ea3-ee47a8c11b2f","Type":"ContainerStarted","Data":"5d11ce5695512941e12afd902b6df0117325f488faee7ef9621e49db7027e620"}
Apr 16 16:41:25.750767 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:25.750754 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-kx9rj"
Apr 16 16:41:25.767874 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:25.767816 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-jpt94" podStartSLOduration=2.377328912 podStartE2EDuration="5.767801497s" podCreationTimestamp="2026-04-16 16:41:20 +0000 UTC" firstStartedPulling="2026-04-16 16:41:21.582300674 +0000 UTC m=+1037.304845041" lastFinishedPulling="2026-04-16 16:41:24.972773255 +0000 UTC m=+1040.695317626" observedRunningTime="2026-04-16 16:41:25.766326508 +0000 UTC m=+1041.488870898" watchObservedRunningTime="2026-04-16 16:41:25.767801497 +0000 UTC m=+1041.490345887"
Apr 16 16:41:25.783644 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:25.783597 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-kx9rj" podStartSLOduration=2.122887675 podStartE2EDuration="5.783582399s" podCreationTimestamp="2026-04-16 16:41:20 +0000 UTC" firstStartedPulling="2026-04-16 16:41:21.302389459 +0000 UTC m=+1037.024933826" lastFinishedPulling="2026-04-16 16:41:24.96308418 +0000 UTC m=+1040.685628550" observedRunningTime="2026-04-16 16:41:25.783216132 +0000 UTC m=+1041.505760519" watchObservedRunningTime="2026-04-16 16:41:25.783582399 +0000 UTC m=+1041.506126788"
Apr 16 16:41:36.760089 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:36.760049 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-kx9rj"
Apr 16 16:41:36.762761 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:36.762744 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-jpt94"
Apr 16 16:41:37.578661 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.578630 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-lks8k"]
Apr 16 16:41:37.584726 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.584710 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lks8k"
Apr 16 16:41:37.592692 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.592666 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-lks8k"]
Apr 16 16:41:37.679151 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.679114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxf94\" (UniqueName: \"kubernetes.io/projected/70f5c9b7-0054-4a03-9d0e-4098e0c8ba03-kube-api-access-vxf94\") pod \"s3-init-lks8k\" (UID: \"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03\") " pod="kserve/s3-init-lks8k"
Apr 16 16:41:37.779620 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.779590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxf94\" (UniqueName: \"kubernetes.io/projected/70f5c9b7-0054-4a03-9d0e-4098e0c8ba03-kube-api-access-vxf94\") pod \"s3-init-lks8k\" (UID: \"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03\") " pod="kserve/s3-init-lks8k"
Apr 16 16:41:37.789236 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.789202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxf94\" (UniqueName: \"kubernetes.io/projected/70f5c9b7-0054-4a03-9d0e-4098e0c8ba03-kube-api-access-vxf94\") pod \"s3-init-lks8k\" (UID: \"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03\") " pod="kserve/s3-init-lks8k"
Apr 16 16:41:37.908482 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:37.908391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lks8k"
Apr 16 16:41:38.024254 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:38.024114 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-lks8k"]
Apr 16 16:41:38.026690 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:41:38.026662 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f5c9b7_0054_4a03_9d0e_4098e0c8ba03.slice/crio-51d1440d9d518e17353c0501d2f9c178da13f47c42b515a50cf08cb463cb5e42 WatchSource:0}: Error finding container 51d1440d9d518e17353c0501d2f9c178da13f47c42b515a50cf08cb463cb5e42: Status 404 returned error can't find the container with id 51d1440d9d518e17353c0501d2f9c178da13f47c42b515a50cf08cb463cb5e42
Apr 16 16:41:38.794261 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:38.794222 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lks8k" event={"ID":"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03","Type":"ContainerStarted","Data":"51d1440d9d518e17353c0501d2f9c178da13f47c42b515a50cf08cb463cb5e42"}
Apr 16 16:41:43.808986 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:43.808946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lks8k" event={"ID":"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03","Type":"ContainerStarted","Data":"d34ee7b51858eb9bec1490c106308b7e305077f5e76084e42e31897bdc596129"}
Apr 16 16:41:43.826940 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:43.826861 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-lks8k" podStartSLOduration=1.762229779 podStartE2EDuration="6.826843391s" podCreationTimestamp="2026-04-16 16:41:37 +0000 UTC" firstStartedPulling="2026-04-16 16:41:38.028424139 +0000 UTC m=+1053.750968523" lastFinishedPulling="2026-04-16 16:41:43.093037764 +0000 UTC m=+1058.815582135" observedRunningTime="2026-04-16 16:41:43.825866714 +0000 UTC m=+1059.548411105" watchObservedRunningTime="2026-04-16 16:41:43.826843391 +0000 UTC m=+1059.549387782"
Apr 16 16:41:46.817037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:46.817001 2578 generic.go:358] "Generic (PLEG): container finished" podID="70f5c9b7-0054-4a03-9d0e-4098e0c8ba03" containerID="d34ee7b51858eb9bec1490c106308b7e305077f5e76084e42e31897bdc596129" exitCode=0
Apr 16 16:41:46.817411 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:46.817073 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lks8k" event={"ID":"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03","Type":"ContainerDied","Data":"d34ee7b51858eb9bec1490c106308b7e305077f5e76084e42e31897bdc596129"}
Apr 16 16:41:47.945098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:47.945076 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lks8k"
Apr 16 16:41:48.064380 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:48.064346 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxf94\" (UniqueName: \"kubernetes.io/projected/70f5c9b7-0054-4a03-9d0e-4098e0c8ba03-kube-api-access-vxf94\") pod \"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03\" (UID: \"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03\") "
Apr 16 16:41:48.066489 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:48.066458 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f5c9b7-0054-4a03-9d0e-4098e0c8ba03-kube-api-access-vxf94" (OuterVolumeSpecName: "kube-api-access-vxf94") pod "70f5c9b7-0054-4a03-9d0e-4098e0c8ba03" (UID: "70f5c9b7-0054-4a03-9d0e-4098e0c8ba03"). InnerVolumeSpecName "kube-api-access-vxf94". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:41:48.165047 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:48.164963 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxf94\" (UniqueName: \"kubernetes.io/projected/70f5c9b7-0054-4a03-9d0e-4098e0c8ba03-kube-api-access-vxf94\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\""
Apr 16 16:41:48.824294 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:48.824269 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lks8k"
Apr 16 16:41:48.824294 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:48.824282 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lks8k" event={"ID":"70f5c9b7-0054-4a03-9d0e-4098e0c8ba03","Type":"ContainerDied","Data":"51d1440d9d518e17353c0501d2f9c178da13f47c42b515a50cf08cb463cb5e42"}
Apr 16 16:41:48.824485 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:48.824313 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51d1440d9d518e17353c0501d2f9c178da13f47c42b515a50cf08cb463cb5e42"
Apr 16 16:41:56.408599 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.408527 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"]
Apr 16 16:41:56.409037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.408799 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f5c9b7-0054-4a03-9d0e-4098e0c8ba03" containerName="s3-init"
Apr 16 16:41:56.409037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.408811 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f5c9b7-0054-4a03-9d0e-4098e0c8ba03" containerName="s3-init"
Apr 16 16:41:56.409037 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.408872 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="70f5c9b7-0054-4a03-9d0e-4098e0c8ba03" containerName="s3-init"
Apr 16 16:41:56.411759 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.411743 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"
Apr 16 16:41:56.440250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.440218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zzfkf\""
Apr 16 16:41:56.456338 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.456310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"]
Apr 16 16:41:56.528786 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.528755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/079ad2c9-be6c-443d-bb79-d883f22b2614-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-84f646df87-ws2ml\" (UID: \"079ad2c9-be6c-443d-bb79-d883f22b2614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"
Apr 16 16:41:56.629893 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.629858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/079ad2c9-be6c-443d-bb79-d883f22b2614-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-84f646df87-ws2ml\" (UID: \"079ad2c9-be6c-443d-bb79-d883f22b2614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"
Apr 16 16:41:56.630273 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.630256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/079ad2c9-be6c-443d-bb79-d883f22b2614-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-84f646df87-ws2ml\" (UID: \"079ad2c9-be6c-443d-bb79-d883f22b2614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"
Apr 16 16:41:56.720731 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.720653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"
Apr 16 16:41:56.872316 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.872288 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"]
Apr 16 16:41:56.875412 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:41:56.875380 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod079ad2c9_be6c_443d_bb79_d883f22b2614.slice/crio-d50345ea40ca4e493265b73094c61273e14db0e53787561fdb45a3f1ee799e6c WatchSource:0}: Error finding container d50345ea40ca4e493265b73094c61273e14db0e53787561fdb45a3f1ee799e6c: Status 404 returned error can't find the container with id d50345ea40ca4e493265b73094c61273e14db0e53787561fdb45a3f1ee799e6c
Apr 16 16:41:56.893754 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.893728 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"]
Apr 16 16:41:56.898295 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.898280 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"
Apr 16 16:41:56.922497 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:56.922458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"]
Apr 16 16:41:57.027580 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.027543 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"]
Apr 16 16:41:57.030787 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.030766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"
Apr 16 16:41:57.033673 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.033651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8ae0449-aec8-4633-997a-7137047bd4ae-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-b6mw4\" (UID: \"a8ae0449-aec8-4633-997a-7137047bd4ae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"
Apr 16 16:41:57.040220 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.040203 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"
Apr 16 16:41:57.051717 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.051670 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"]
Apr 16 16:41:57.134520 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.134484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8ae0449-aec8-4633-997a-7137047bd4ae-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-b6mw4\" (UID: \"a8ae0449-aec8-4633-997a-7137047bd4ae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"
Apr 16 16:41:57.134884 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.134860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8ae0449-aec8-4633-997a-7137047bd4ae-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-b6mw4\" (UID: \"a8ae0449-aec8-4633-997a-7137047bd4ae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"
Apr 16 16:41:57.208284 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.208245 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"
Apr 16 16:41:57.295594 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.295569 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"]
Apr 16 16:41:57.297361 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:41:57.297334 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84927f2c_3dcf_4a35_9cdd_6f8b51ed08d7.slice/crio-597b36ef6b1adbb98961736c189f7e766c0bc6a31a75c61bf1221166d5ed0263 WatchSource:0}: Error finding container 597b36ef6b1adbb98961736c189f7e766c0bc6a31a75c61bf1221166d5ed0263: Status 404 returned error can't find the container with id 597b36ef6b1adbb98961736c189f7e766c0bc6a31a75c61bf1221166d5ed0263
Apr 16 16:41:57.357543 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.357506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"]
Apr 16 16:41:57.361752 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:41:57.361722 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ae0449_aec8_4633_997a_7137047bd4ae.slice/crio-32c0764fe27a780f058898999d071008eb9a989250c480d72a979eaafb36fb67 WatchSource:0}: Error finding container 32c0764fe27a780f058898999d071008eb9a989250c480d72a979eaafb36fb67: Status 404 returned error can't find the container with id 32c0764fe27a780f058898999d071008eb9a989250c480d72a979eaafb36fb67
Apr 16 16:41:57.522069 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.522035 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"]
Apr 16 16:41:57.525852 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.525826 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"
Apr 16 16:41:57.586907 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.586855 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"]
Apr 16 16:41:57.639239 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.639190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30afbb56-ef59-4826-802f-2efdf8c52792-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-555687cc47-2p67c\" (UID: \"30afbb56-ef59-4826-802f-2efdf8c52792\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"
Apr 16 16:41:57.740646 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.740547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30afbb56-ef59-4826-802f-2efdf8c52792-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-555687cc47-2p67c\" (UID: \"30afbb56-ef59-4826-802f-2efdf8c52792\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"
Apr 16 16:41:57.740988 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.740963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30afbb56-ef59-4826-802f-2efdf8c52792-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-555687cc47-2p67c\" (UID: \"30afbb56-ef59-4826-802f-2efdf8c52792\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"
Apr 16 16:41:57.843128 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.843037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"
Apr 16 16:41:57.855064 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.855018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" event={"ID":"84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7","Type":"ContainerStarted","Data":"597b36ef6b1adbb98961736c189f7e766c0bc6a31a75c61bf1221166d5ed0263"}
Apr 16 16:41:57.859619 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.859564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" event={"ID":"079ad2c9-be6c-443d-bb79-d883f22b2614","Type":"ContainerStarted","Data":"d50345ea40ca4e493265b73094c61273e14db0e53787561fdb45a3f1ee799e6c"}
Apr 16 16:41:57.865351 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:57.865289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" event={"ID":"a8ae0449-aec8-4633-997a-7137047bd4ae","Type":"ContainerStarted","Data":"32c0764fe27a780f058898999d071008eb9a989250c480d72a979eaafb36fb67"}
Apr 16 16:41:58.085259 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:58.085199 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"]
Apr 16 16:41:58.873438 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:41:58.873377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" event={"ID":"30afbb56-ef59-4826-802f-2efdf8c52792","Type":"ContainerStarted","Data":"9a80d23b4c8b6fe71991211e2d0a397dc1e82232460274f530b976c3c469ef56"}
Apr 16 16:42:11.923056 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.923018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" event={"ID":"079ad2c9-be6c-443d-bb79-d883f22b2614","Type":"ContainerStarted","Data":"4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb"}
Apr 16 16:42:11.924352 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.924321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" event={"ID":"a8ae0449-aec8-4633-997a-7137047bd4ae","Type":"ContainerStarted","Data":"bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22"}
Apr 16 16:42:11.925547 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.925517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" event={"ID":"30afbb56-ef59-4826-802f-2efdf8c52792","Type":"ContainerStarted","Data":"5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0"}
Apr 16 16:42:11.926639 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.926616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" event={"ID":"84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7","Type":"ContainerStarted","Data":"8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d"}
Apr 16 16:42:11.926799 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.926786 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"
Apr 16 16:42:11.928183 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.928155 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:42:11.968995 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:11.968899 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podStartSLOduration=2.02434248 podStartE2EDuration="15.968882139s" podCreationTimestamp="2026-04-16 16:41:56 +0000 UTC" firstStartedPulling="2026-04-16 16:41:57.299788359 +0000 UTC m=+1073.022332727" lastFinishedPulling="2026-04-16 16:42:11.244328005 +0000 UTC m=+1086.966872386" observedRunningTime="2026-04-16 16:42:11.968388274 +0000 UTC m=+1087.690932666" watchObservedRunningTime="2026-04-16 16:42:11.968882139 +0000 UTC m=+1087.691426531"
Apr 16 16:42:12.930235 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:12.930196 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:42:14.936158 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:14.936119 2578 generic.go:358] "Generic (PLEG): container finished" podID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerID="4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb" exitCode=0
Apr 16 16:42:14.936526 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:14.936191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" event={"ID":"079ad2c9-be6c-443d-bb79-d883f22b2614","Type":"ContainerDied","Data":"4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb"}
Apr 16 16:42:15.941102 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:15.941065 2578 generic.go:358] "Generic (PLEG): container finished" podID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerID="bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22" exitCode=0
Apr 16 16:42:15.941532 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:15.941134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" event={"ID":"a8ae0449-aec8-4633-997a-7137047bd4ae","Type":"ContainerDied","Data":"bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22"}
Apr 16 16:42:15.942516 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:15.942497 2578 generic.go:358] "Generic (PLEG): container finished" podID="30afbb56-ef59-4826-802f-2efdf8c52792" containerID="5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0" exitCode=0
Apr 16 16:42:15.942584 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:15.942567 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" event={"ID":"30afbb56-ef59-4826-802f-2efdf8c52792","Type":"ContainerDied","Data":"5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0"}
Apr 16 16:42:22.930729 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:22.930685 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:42:23.976562 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.976471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" event={"ID":"079ad2c9-be6c-443d-bb79-d883f22b2614","Type":"ContainerStarted","Data":"064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4"}
Apr 16 16:42:23.977026 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.976862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"
Apr 16 16:42:23.978571 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.978537 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:42:23.978695 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.978652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" event={"ID":"30afbb56-ef59-4826-802f-2efdf8c52792","Type":"ContainerStarted","Data":"ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8"}
Apr 16 16:42:23.978993 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.978970 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"
Apr 16 16:42:23.980241 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.980142 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:42:23.994832 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:23.994774 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podStartSLOduration=1.2880864189999999 podStartE2EDuration="27.994756302s" podCreationTimestamp="2026-04-16 16:41:56 +0000 UTC" firstStartedPulling="2026-04-16 16:41:56.877377244 +0000 UTC m=+1072.599921615" lastFinishedPulling="2026-04-16 16:42:23.584047115 +0000 UTC m=+1099.306591498" observedRunningTime="2026-04-16 16:42:23.993610714 +0000 UTC m=+1099.716155104" watchObservedRunningTime="2026-04-16 16:42:23.994756302 +0000 UTC m=+1099.717300714"
Apr 16 16:42:24.012792 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:24.012735 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podStartSLOduration=1.511937569 podStartE2EDuration="27.012713321s" podCreationTimestamp="2026-04-16 16:41:57 +0000 UTC" firstStartedPulling="2026-04-16 16:41:58.10137008 +0000 UTC m=+1073.823914453" lastFinishedPulling="2026-04-16 16:42:23.602145826 +0000 UTC m=+1099.324690205" observedRunningTime="2026-04-16 16:42:24.012129623 +0000 UTC m=+1099.734674024" watchObservedRunningTime="2026-04-16 16:42:24.012713321 +0000 UTC m=+1099.735257712"
Apr 16 16:42:24.982772 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:24.982728 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:42:24.983278 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:24.982728 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:42:32.930640 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:32.930590 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:42:34.982777 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:34.982687 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:42:34.983254 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:34.982791 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:42:35.017601 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:35.017562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" event={"ID":"a8ae0449-aec8-4633-997a-7137047bd4ae","Type":"ContainerStarted","Data":"4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8"}
Apr 16 16:42:35.017862 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:35.017844 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"
Apr 16 16:42:35.019183 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:35.019153 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 16:42:35.043137 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:35.043074 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podStartSLOduration=1.734386325 podStartE2EDuration="39.043053339s" podCreationTimestamp="2026-04-16 16:41:56 +0000 UTC" firstStartedPulling="2026-04-16 16:41:57.364234672 +0000 UTC m=+1073.086779040" lastFinishedPulling="2026-04-16 16:42:34.672901681 +0000 UTC m=+1110.395446054" observedRunningTime="2026-04-16 16:42:35.041276981 +0000 UTC m=+1110.763821383" watchObservedRunningTime="2026-04-16 16:42:35.043053339 +0000 UTC m=+1110.765597729"
Apr 16 16:42:36.020714 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:36.020678 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 16:42:42.930317 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:42.930274 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:42:44.982785 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:44.982739 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:42:44.983209 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:44.982954 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:42:46.021146 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:46.021098 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 16:42:52.931083 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:52.931039 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:42:54.982792 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:54.982743 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:42:54.983222 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:54.982952 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:42:56.021294 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:42:56.021252 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 16:43:02.930501 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:02.930454 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 16:43:04.983141 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:04.983102 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:43:04.983536 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:04.983102 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:43:06.021617 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:06.021573 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 16:43:12.931574 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:12.931545 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"
Apr 16 16:43:14.983505 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:14.983459 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 16:43:14.983907 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:14.983459 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 16:43:16.020884 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:16.020840 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 16:43:24.983075 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:24.982993 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 16:43:24.983515 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:24.983000 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 16 16:43:26.021654 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:26.021609 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 16:43:31.900581 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:31.900551 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"] Apr 16 16:43:31.901072 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:31.900772 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" 
containerName="kserve-container" containerID="cri-o://8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d" gracePeriod=30 Apr 16 16:43:31.929981 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:31.929943 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z"] Apr 16 16:43:31.932225 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:31.932195 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" Apr 16 16:43:31.943134 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:31.943101 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z"] Apr 16 16:43:31.944444 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:31.944420 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" Apr 16 16:43:32.070812 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.070788 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z"] Apr 16 16:43:32.073184 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:43:32.073158 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod691deef6_3160_4e56_bedb_5a4579b66467.slice/crio-47ec56eae25ba6f36251986e47126d5771bfdcf025afc9ad360b210d8c69013b WatchSource:0}: Error finding container 47ec56eae25ba6f36251986e47126d5771bfdcf025afc9ad360b210d8c69013b: Status 404 returned error can't find the container with id 47ec56eae25ba6f36251986e47126d5771bfdcf025afc9ad360b210d8c69013b Apr 16 16:43:32.183043 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.182955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" 
event={"ID":"691deef6-3160-4e56-bedb-5a4579b66467","Type":"ContainerStarted","Data":"60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5"} Apr 16 16:43:32.183043 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.182995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" event={"ID":"691deef6-3160-4e56-bedb-5a4579b66467","Type":"ContainerStarted","Data":"47ec56eae25ba6f36251986e47126d5771bfdcf025afc9ad360b210d8c69013b"} Apr 16 16:43:32.183260 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.183119 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" Apr 16 16:43:32.184248 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.184225 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 16:43:32.200240 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.200194 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podStartSLOduration=1.200179768 podStartE2EDuration="1.200179768s" podCreationTimestamp="2026-04-16 16:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:43:32.198429321 +0000 UTC m=+1167.920973712" watchObservedRunningTime="2026-04-16 16:43:32.200179768 +0000 UTC m=+1167.922724157" Apr 16 16:43:32.931362 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:32.931320 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 16 16:43:33.186135 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:33.186054 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 16:43:34.984096 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:34.984068 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" Apr 16 16:43:34.984456 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:34.984127 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" Apr 16 16:43:35.143506 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.143478 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" Apr 16 16:43:35.192896 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.192811 2578 generic.go:358] "Generic (PLEG): container finished" podID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerID="8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d" exitCode=0 Apr 16 16:43:35.192896 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.192877 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" Apr 16 16:43:35.193098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.192893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" event={"ID":"84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7","Type":"ContainerDied","Data":"8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d"} Apr 16 16:43:35.193098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.192960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6" event={"ID":"84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7","Type":"ContainerDied","Data":"597b36ef6b1adbb98961736c189f7e766c0bc6a31a75c61bf1221166d5ed0263"} Apr 16 16:43:35.193098 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.192982 2578 scope.go:117] "RemoveContainer" containerID="8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d" Apr 16 16:43:35.201005 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.200984 2578 scope.go:117] "RemoveContainer" containerID="8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d" Apr 16 16:43:35.201281 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:43:35.201262 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d\": container with ID starting with 8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d not found: ID does not exist" containerID="8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d" Apr 16 16:43:35.201367 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.201289 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d"} err="failed to get container status 
\"8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d\": rpc error: code = NotFound desc = could not find container \"8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d\": container with ID starting with 8e403441223492976a60c57be8b5d416fcee35edc223152f2b84bcfde4a1bf1d not found: ID does not exist" Apr 16 16:43:35.214823 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.214786 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"] Apr 16 16:43:35.216998 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:35.216970 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c599-predictor-64946668d-nbsr6"] Apr 16 16:43:36.021253 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:36.021208 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 16:43:36.854289 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:36.854253 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" path="/var/lib/kubelet/pods/84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7/volumes" Apr 16 16:43:43.186156 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:43.186114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 16:43:46.022088 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:46.022061 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" Apr 16 
16:43:53.186984 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:43:53.186907 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 16:44:03.186507 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:03.186454 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 16:44:04.800700 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:04.800674 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:44:04.803323 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:04.803305 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:44:06.539272 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.539234 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"] Apr 16 16:44:06.539624 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.539568 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" containerID="cri-o://4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8" gracePeriod=30 Apr 16 16:44:06.598075 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.598043 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"] Apr 16 16:44:06.598436 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.598390 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" containerID="cri-o://ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8" gracePeriod=30 Apr 16 16:44:06.653665 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.653629 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"] Apr 16 16:44:06.653939 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.653905 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" containerID="cri-o://064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4" gracePeriod=30 Apr 16 16:44:06.718404 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.718372 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb"] Apr 16 16:44:06.718859 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.718844 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" Apr 16 16:44:06.718900 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.718863 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" Apr 16 16:44:06.718983 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.718971 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84927f2c-3dcf-4a35-9cdd-6f8b51ed08d7" containerName="kserve-container" Apr 16 
16:44:06.721957 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.721896 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" Apr 16 16:44:06.731167 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.731134 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb"] Apr 16 16:44:06.739993 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.735313 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" Apr 16 16:44:06.862655 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:06.862621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb"] Apr 16 16:44:06.865537 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:44:06.865496 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf842b0b3_f94c_4fa7_8d35_4f3ffdbd8cc5.slice/crio-5732405b1c9f7bfaf602ff819a102f3400241ebc79b88bc0fdb634701ae9a5bf WatchSource:0}: Error finding container 5732405b1c9f7bfaf602ff819a102f3400241ebc79b88bc0fdb634701ae9a5bf: Status 404 returned error can't find the container with id 5732405b1c9f7bfaf602ff819a102f3400241ebc79b88bc0fdb634701ae9a5bf Apr 16 16:44:07.291709 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:07.291675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" event={"ID":"f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5","Type":"ContainerStarted","Data":"d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c"} Apr 16 16:44:07.291709 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:07.291709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" 
event={"ID":"f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5","Type":"ContainerStarted","Data":"5732405b1c9f7bfaf602ff819a102f3400241ebc79b88bc0fdb634701ae9a5bf"} Apr 16 16:44:07.292079 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:07.291868 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" Apr 16 16:44:07.293054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:07.293028 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 16:44:07.307057 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:07.307007 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podStartSLOduration=1.306989365 podStartE2EDuration="1.306989365s" podCreationTimestamp="2026-04-16 16:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:44:07.30660373 +0000 UTC m=+1203.029148121" watchObservedRunningTime="2026-04-16 16:44:07.306989365 +0000 UTC m=+1203.029533758" Apr 16 16:44:08.294553 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:08.294518 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 16:44:10.788741 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:10.788713 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" Apr 16 16:44:10.850810 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:10.850778 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8ae0449-aec8-4633-997a-7137047bd4ae-kserve-provision-location\") pod \"a8ae0449-aec8-4633-997a-7137047bd4ae\" (UID: \"a8ae0449-aec8-4633-997a-7137047bd4ae\") " Apr 16 16:44:10.851143 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:10.851119 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ae0449-aec8-4633-997a-7137047bd4ae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8ae0449-aec8-4633-997a-7137047bd4ae" (UID: "a8ae0449-aec8-4633-997a-7137047bd4ae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:10.952253 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:10.952172 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8ae0449-aec8-4633-997a-7137047bd4ae-kserve-provision-location\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:44:11.240550 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.240526 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" Apr 16 16:44:11.304568 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.304530 2578 generic.go:358] "Generic (PLEG): container finished" podID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerID="4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8" exitCode=0 Apr 16 16:44:11.304764 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.304596 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" Apr 16 16:44:11.304764 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.304596 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" event={"ID":"a8ae0449-aec8-4633-997a-7137047bd4ae","Type":"ContainerDied","Data":"4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8"} Apr 16 16:44:11.304764 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.304694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4" event={"ID":"a8ae0449-aec8-4633-997a-7137047bd4ae","Type":"ContainerDied","Data":"32c0764fe27a780f058898999d071008eb9a989250c480d72a979eaafb36fb67"} Apr 16 16:44:11.304764 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.304712 2578 scope.go:117] "RemoveContainer" containerID="4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8" Apr 16 16:44:11.306082 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.306060 2578 generic.go:358] "Generic (PLEG): container finished" podID="30afbb56-ef59-4826-802f-2efdf8c52792" containerID="ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8" exitCode=0 Apr 16 16:44:11.306190 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.306131 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" Apr 16 16:44:11.306190 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.306166 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" event={"ID":"30afbb56-ef59-4826-802f-2efdf8c52792","Type":"ContainerDied","Data":"ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8"} Apr 16 16:44:11.306270 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.306207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c" event={"ID":"30afbb56-ef59-4826-802f-2efdf8c52792","Type":"ContainerDied","Data":"9a80d23b4c8b6fe71991211e2d0a397dc1e82232460274f530b976c3c469ef56"} Apr 16 16:44:11.312547 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.312527 2578 scope.go:117] "RemoveContainer" containerID="bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22" Apr 16 16:44:11.319472 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.319453 2578 scope.go:117] "RemoveContainer" containerID="4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8" Apr 16 16:44:11.319714 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:44:11.319699 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8\": container with ID starting with 4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8 not found: ID does not exist" containerID="4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8" Apr 16 16:44:11.319762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.319721 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8"} err="failed to get container status 
\"4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8\": rpc error: code = NotFound desc = could not find container \"4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8\": container with ID starting with 4266b196383f1c0cbc1e1b944558b5516691fb557397e5c954adcd2d67bde7b8 not found: ID does not exist" Apr 16 16:44:11.319762 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.319737 2578 scope.go:117] "RemoveContainer" containerID="bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22" Apr 16 16:44:11.319987 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:44:11.319969 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22\": container with ID starting with bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22 not found: ID does not exist" containerID="bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22" Apr 16 16:44:11.320054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.319992 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22"} err="failed to get container status \"bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22\": rpc error: code = NotFound desc = could not find container \"bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22\": container with ID starting with bb5b27d2436085e971c2d9406a7300cd6beb19e7ea8a86c2392bebc70ad2da22 not found: ID does not exist" Apr 16 16:44:11.320054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.320007 2578 scope.go:117] "RemoveContainer" containerID="ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8" Apr 16 16:44:11.324624 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.324603 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"] Apr 16 16:44:11.326777 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.326757 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-b6mw4"] Apr 16 16:44:11.327436 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.327423 2578 scope.go:117] "RemoveContainer" containerID="5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0" Apr 16 16:44:11.334104 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.334084 2578 scope.go:117] "RemoveContainer" containerID="ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8" Apr 16 16:44:11.334395 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:44:11.334377 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8\": container with ID starting with ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8 not found: ID does not exist" containerID="ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8" Apr 16 16:44:11.334469 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.334414 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8"} err="failed to get container status \"ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8\": rpc error: code = NotFound desc = could not find container \"ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8\": container with ID starting with ac86c9975a559a9e4e78572d95c6b7c48700126f3b8773fb67832d4b327b5bd8 not found: ID does not exist" Apr 16 16:44:11.334469 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.334447 2578 scope.go:117] "RemoveContainer" containerID="5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0" Apr 16 16:44:11.334733 
ip-10-0-128-64 kubenswrapper[2578]: E0416 16:44:11.334718 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0\": container with ID starting with 5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0 not found: ID does not exist" containerID="5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0" Apr 16 16:44:11.334787 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.334741 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0"} err="failed to get container status \"5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0\": rpc error: code = NotFound desc = could not find container \"5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0\": container with ID starting with 5540a188ce0e94f12c0ae29a3f99907889f6d2ef81ab2f78a86fc998b6eab4e0 not found: ID does not exist" Apr 16 16:44:11.355097 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.355060 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30afbb56-ef59-4826-802f-2efdf8c52792-kserve-provision-location\") pod \"30afbb56-ef59-4826-802f-2efdf8c52792\" (UID: \"30afbb56-ef59-4826-802f-2efdf8c52792\") " Apr 16 16:44:11.355390 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.355371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30afbb56-ef59-4826-802f-2efdf8c52792-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30afbb56-ef59-4826-802f-2efdf8c52792" (UID: "30afbb56-ef59-4826-802f-2efdf8c52792"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:11.456130 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.456093 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30afbb56-ef59-4826-802f-2efdf8c52792-kserve-provision-location\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:44:11.630206 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.630176 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"] Apr 16 16:44:11.636977 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.636948 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-2p67c"] Apr 16 16:44:11.683196 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.683168 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" Apr 16 16:44:11.757802 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.757765 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/079ad2c9-be6c-443d-bb79-d883f22b2614-kserve-provision-location\") pod \"079ad2c9-be6c-443d-bb79-d883f22b2614\" (UID: \"079ad2c9-be6c-443d-bb79-d883f22b2614\") " Apr 16 16:44:11.760948 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.758361 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/079ad2c9-be6c-443d-bb79-d883f22b2614-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "079ad2c9-be6c-443d-bb79-d883f22b2614" (UID: "079ad2c9-be6c-443d-bb79-d883f22b2614"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:11.859565 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:11.859512 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/079ad2c9-be6c-443d-bb79-d883f22b2614-kserve-provision-location\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 16:44:12.310341 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.310309 2578 generic.go:358] "Generic (PLEG): container finished" podID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerID="064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4" exitCode=0 Apr 16 16:44:12.310525 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.310379 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" event={"ID":"079ad2c9-be6c-443d-bb79-d883f22b2614","Type":"ContainerDied","Data":"064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4"} Apr 16 16:44:12.310525 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.310404 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" Apr 16 16:44:12.310525 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.310419 2578 scope.go:117] "RemoveContainer" containerID="064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4" Apr 16 16:44:12.310686 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.310408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml" event={"ID":"079ad2c9-be6c-443d-bb79-d883f22b2614","Type":"ContainerDied","Data":"d50345ea40ca4e493265b73094c61273e14db0e53787561fdb45a3f1ee799e6c"} Apr 16 16:44:12.318138 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.318118 2578 scope.go:117] "RemoveContainer" containerID="4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb" Apr 16 16:44:12.324869 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.324851 2578 scope.go:117] "RemoveContainer" containerID="064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4" Apr 16 16:44:12.325116 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:44:12.325100 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4\": container with ID starting with 064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4 not found: ID does not exist" containerID="064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4" Apr 16 16:44:12.325173 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.325124 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4"} err="failed to get container status \"064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4\": rpc error: code = NotFound desc = could not find container 
\"064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4\": container with ID starting with 064c144862f1bba259d36069abcb765a7dc8cf583f147ff7b5be8fba43b9b5f4 not found: ID does not exist" Apr 16 16:44:12.325173 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.325140 2578 scope.go:117] "RemoveContainer" containerID="4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb" Apr 16 16:44:12.325337 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:44:12.325315 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb\": container with ID starting with 4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb not found: ID does not exist" containerID="4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb" Apr 16 16:44:12.325388 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.325340 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb"} err="failed to get container status \"4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb\": rpc error: code = NotFound desc = could not find container \"4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb\": container with ID starting with 4b3f6d3d9ca0d4e18782f5af3510e26df177f61364e870a4d5445d9f6e3d4ecb not found: ID does not exist" Apr 16 16:44:12.331705 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.331684 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"] Apr 16 16:44:12.332910 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.332890 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-ws2ml"] Apr 16 16:44:12.854669 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.854638 2578 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" path="/var/lib/kubelet/pods/079ad2c9-be6c-443d-bb79-d883f22b2614/volumes" Apr 16 16:44:12.854997 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.854985 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" path="/var/lib/kubelet/pods/30afbb56-ef59-4826-802f-2efdf8c52792/volumes" Apr 16 16:44:12.855300 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:12.855289 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" path="/var/lib/kubelet/pods/a8ae0449-aec8-4633-997a-7137047bd4ae/volumes" Apr 16 16:44:13.187156 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:13.187062 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 16:44:18.294857 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:18.294815 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 16:44:23.187482 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:23.187451 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" Apr 16 16:44:28.295025 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:28.294909 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 16:44:38.295424 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:38.295382 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 16:44:48.294893 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:48.294845 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 16:44:58.296162 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:44:58.296072 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" Apr 16 16:49:04.820482 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:49:04.820362 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:49:04.823803 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:49:04.823787 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:52:56.918596 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:56.918516 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z"] Apr 16 16:52:56.919177 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:56.918751 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" 
podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" containerID="cri-o://60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5" gracePeriod=30 Apr 16 16:52:57.053641 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053611 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn"] Apr 16 16:52:57.053943 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053910 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="storage-initializer" Apr 16 16:52:57.053943 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053935 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="storage-initializer" Apr 16 16:52:57.053943 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053942 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="storage-initializer" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053948 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="storage-initializer" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053959 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053964 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053973 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" Apr 16 
16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053978 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053986 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053991 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.053997 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="storage-initializer" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.054002 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="storage-initializer" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.054042 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="30afbb56-ef59-4826-802f-2efdf8c52792" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.054050 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8ae0449-aec8-4633-997a-7137047bd4ae" containerName="kserve-container" Apr 16 16:52:57.054054 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.054057 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="079ad2c9-be6c-443d-bb79-d883f22b2614" containerName="kserve-container" Apr 16 16:52:57.055993 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.055980 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" Apr 16 16:52:57.060752 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.060721 2578 status_manager.go:895] "Failed to get status for pod" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" err="pods \"error-404-isvc-bab33-predictor-86bd4f56b4-97wxn\" is forbidden: User \"system:node:ip-10-0-128-64.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-128-64.ec2.internal' and this object" Apr 16 16:52:57.064838 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.064821 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" Apr 16 16:52:57.087713 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.087679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn"] Apr 16 16:52:57.186008 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.185982 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn"] Apr 16 16:52:57.188382 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:52:57.188349 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7004b4b1_b124_47f6_9777_6d7ea8068d1e.slice/crio-07babd12f7c407f518f8ca30e1e58d7bdaf7b32a1de922425d705c4656a77b3a WatchSource:0}: Error finding container 07babd12f7c407f518f8ca30e1e58d7bdaf7b32a1de922425d705c4656a77b3a: Status 404 returned error can't find the container with id 07babd12f7c407f518f8ca30e1e58d7bdaf7b32a1de922425d705c4656a77b3a Apr 16 16:52:57.190302 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.190284 2578 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 16 16:52:57.737983 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.737941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" event={"ID":"7004b4b1-b124-47f6-9777-6d7ea8068d1e","Type":"ContainerStarted","Data":"38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5"} Apr 16 16:52:57.737983 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.737979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" event={"ID":"7004b4b1-b124-47f6-9777-6d7ea8068d1e","Type":"ContainerStarted","Data":"07babd12f7c407f518f8ca30e1e58d7bdaf7b32a1de922425d705c4656a77b3a"} Apr 16 16:52:57.738242 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.738165 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" Apr 16 16:52:57.739412 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.739379 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:52:57.754581 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:57.754535 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podStartSLOduration=0.754521195 podStartE2EDuration="754.521195ms" podCreationTimestamp="2026-04-16 16:52:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:52:57.753800752 +0000 UTC m=+1733.476345142" watchObservedRunningTime="2026-04-16 16:52:57.754521195 +0000 UTC m=+1733.477065584" Apr 
16 16:52:58.741981 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:52:58.741939 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:53:00.059526 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.059505 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" Apr 16 16:53:00.747896 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.747862 2578 generic.go:358] "Generic (PLEG): container finished" podID="691deef6-3160-4e56-bedb-5a4579b66467" containerID="60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5" exitCode=0 Apr 16 16:53:00.748070 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.747939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" event={"ID":"691deef6-3160-4e56-bedb-5a4579b66467","Type":"ContainerDied","Data":"60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5"} Apr 16 16:53:00.748070 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.747943 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" Apr 16 16:53:00.748070 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.747967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z" event={"ID":"691deef6-3160-4e56-bedb-5a4579b66467","Type":"ContainerDied","Data":"47ec56eae25ba6f36251986e47126d5771bfdcf025afc9ad360b210d8c69013b"} Apr 16 16:53:00.748070 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.747982 2578 scope.go:117] "RemoveContainer" containerID="60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5" Apr 16 16:53:00.756063 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.756037 2578 scope.go:117] "RemoveContainer" containerID="60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5" Apr 16 16:53:00.756305 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:53:00.756284 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5\": container with ID starting with 60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5 not found: ID does not exist" containerID="60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5" Apr 16 16:53:00.756377 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.756316 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5"} err="failed to get container status \"60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5\": rpc error: code = NotFound desc = could not find container \"60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5\": container with ID starting with 60f561e1212d4081bf4f9a96913a0a40f805b4b94544604e22bccb455e0285a5 not found: ID does not exist" Apr 16 16:53:00.769969 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:53:00.769945 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z"] Apr 16 16:53:00.776805 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.776786 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df178-predictor-58dc95b6f4-rjs9z"] Apr 16 16:53:00.854231 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:00.854204 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691deef6-3160-4e56-bedb-5a4579b66467" path="/var/lib/kubelet/pods/691deef6-3160-4e56-bedb-5a4579b66467/volumes" Apr 16 16:53:08.741916 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:08.741872 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:53:18.742595 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:18.742543 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:53:28.742093 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:28.742054 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:53:31.770375 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.770336 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb"] Apr 16 
16:53:31.770822 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.770732 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" containerID="cri-o://d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c" gracePeriod=30 Apr 16 16:53:31.793463 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.793432 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv"] Apr 16 16:53:31.793839 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.793822 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" Apr 16 16:53:31.793839 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.793841 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" Apr 16 16:53:31.793998 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.793886 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="691deef6-3160-4e56-bedb-5a4579b66467" containerName="kserve-container" Apr 16 16:53:31.798153 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.798135 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" Apr 16 16:53:31.804129 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.804106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv"] Apr 16 16:53:31.808601 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.808579 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" Apr 16 16:53:31.928469 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:31.928436 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv"] Apr 16 16:53:31.931845 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:53:31.931818 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fb00e18_8ad3_425d_bbf2_46722da02dd7.slice/crio-e1cca482822d1211189024dc54c153860a5ce0f652f66bd8a77de097c01a206a WatchSource:0}: Error finding container e1cca482822d1211189024dc54c153860a5ce0f652f66bd8a77de097c01a206a: Status 404 returned error can't find the container with id e1cca482822d1211189024dc54c153860a5ce0f652f66bd8a77de097c01a206a Apr 16 16:53:32.844347 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:32.844312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" event={"ID":"3fb00e18-8ad3-425d-bbf2-46722da02dd7","Type":"ContainerStarted","Data":"120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d"} Apr 16 16:53:32.844347 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:32.844350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" event={"ID":"3fb00e18-8ad3-425d-bbf2-46722da02dd7","Type":"ContainerStarted","Data":"e1cca482822d1211189024dc54c153860a5ce0f652f66bd8a77de097c01a206a"} Apr 16 16:53:32.844820 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:32.844604 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" Apr 16 16:53:32.845808 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:32.845777 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:53:32.866008 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:32.865967 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podStartSLOduration=1.865954346 podStartE2EDuration="1.865954346s" podCreationTimestamp="2026-04-16 16:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:53:32.864053093 +0000 UTC m=+1768.586597482" watchObservedRunningTime="2026-04-16 16:53:32.865954346 +0000 UTC m=+1768.588498735" Apr 16 16:53:33.847272 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:33.847233 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:53:35.314209 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.314190 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" Apr 16 16:53:35.854065 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.854032 2578 generic.go:358] "Generic (PLEG): container finished" podID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerID="d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c" exitCode=0 Apr 16 16:53:35.854250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.854074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" event={"ID":"f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5","Type":"ContainerDied","Data":"d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c"} Apr 16 16:53:35.854250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.854088 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" Apr 16 16:53:35.854250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.854103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb" event={"ID":"f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5","Type":"ContainerDied","Data":"5732405b1c9f7bfaf602ff819a102f3400241ebc79b88bc0fdb634701ae9a5bf"} Apr 16 16:53:35.854250 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.854118 2578 scope.go:117] "RemoveContainer" containerID="d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c" Apr 16 16:53:35.862197 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.862179 2578 scope.go:117] "RemoveContainer" containerID="d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c" Apr 16 16:53:35.862433 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:53:35.862416 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c\": 
container with ID starting with d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c not found: ID does not exist" containerID="d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c" Apr 16 16:53:35.862496 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.862442 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c"} err="failed to get container status \"d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c\": rpc error: code = NotFound desc = could not find container \"d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c\": container with ID starting with d5ae13843211e9b7955100b0bfcd8d05feacf66ddb8d4b3dcf8343ee07c4262c not found: ID does not exist" Apr 16 16:53:35.876336 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.876311 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb"] Apr 16 16:53:35.879467 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:35.879445 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39f4a-predictor-8fc685496-rwzlb"] Apr 16 16:53:36.855662 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:36.855620 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" path="/var/lib/kubelet/pods/f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5/volumes" Apr 16 16:53:38.742749 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:38.742683 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:53:43.848129 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:43.848083 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:53:48.743010 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:48.742973 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" Apr 16 16:53:53.847888 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:53:53.847789 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:54:03.848120 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:03.848074 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:54:04.841525 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:04.841400 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:54:04.849463 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:04.845357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:54:13.848014 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:13.847966 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:54:17.408502 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.408466 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn"] Apr 16 16:54:17.408980 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.408711 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" containerID="cri-o://38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5" gracePeriod=30 Apr 16 16:54:17.425314 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.425282 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"] Apr 16 16:54:17.425606 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.425591 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" Apr 16 16:54:17.425606 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.425606 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" Apr 16 16:54:17.425714 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.425657 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f842b0b3-f94c-4fa7-8d35-4f3ffdbd8cc5" containerName="kserve-container" Apr 16 16:54:17.428427 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.428407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" Apr 16 16:54:17.437980 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.437953 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" Apr 16 16:54:17.448611 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.448585 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"] Apr 16 16:54:17.570972 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.570940 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"] Apr 16 16:54:17.574254 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:54:17.574218 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ce8d3e_23ce_46ec_983b_495828ebd970.slice/crio-202e66e28b4a39898dcb95b56e47009471562c028d126bfbbc9f80611824b550 WatchSource:0}: Error finding container 202e66e28b4a39898dcb95b56e47009471562c028d126bfbbc9f80611824b550: Status 404 returned error can't find the container with id 202e66e28b4a39898dcb95b56e47009471562c028d126bfbbc9f80611824b550 Apr 16 16:54:17.975300 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.975265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" event={"ID":"e8ce8d3e-23ce-46ec-983b-495828ebd970","Type":"ContainerStarted","Data":"f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3"} Apr 16 16:54:17.975300 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.975302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" event={"ID":"e8ce8d3e-23ce-46ec-983b-495828ebd970","Type":"ContainerStarted","Data":"202e66e28b4a39898dcb95b56e47009471562c028d126bfbbc9f80611824b550"} Apr 16 16:54:17.975510 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.975493 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" Apr 16 16:54:17.976738 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.976709 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 16:54:17.992940 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:17.992880 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podStartSLOduration=0.992867165 podStartE2EDuration="992.867165ms" podCreationTimestamp="2026-04-16 16:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:54:17.992227969 +0000 UTC m=+1813.714772360" watchObservedRunningTime="2026-04-16 16:54:17.992867165 +0000 UTC m=+1813.715411555" Apr 16 16:54:18.742767 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:18.742721 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 16:54:18.978235 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:18.978196 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 16:54:20.449800 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.449775 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" Apr 16 16:54:20.983680 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.983651 2578 generic.go:358] "Generic (PLEG): container finished" podID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerID="38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5" exitCode=0 Apr 16 16:54:20.983829 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.983710 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" Apr 16 16:54:20.983829 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.983728 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" event={"ID":"7004b4b1-b124-47f6-9777-6d7ea8068d1e","Type":"ContainerDied","Data":"38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5"} Apr 16 16:54:20.983829 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.983767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn" event={"ID":"7004b4b1-b124-47f6-9777-6d7ea8068d1e","Type":"ContainerDied","Data":"07babd12f7c407f518f8ca30e1e58d7bdaf7b32a1de922425d705c4656a77b3a"} Apr 16 16:54:20.983829 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.983786 2578 scope.go:117] "RemoveContainer" containerID="38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5" Apr 16 16:54:20.991402 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.991389 2578 scope.go:117] "RemoveContainer" containerID="38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5" Apr 16 16:54:20.991630 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:54:20.991614 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5\": 
container with ID starting with 38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5 not found: ID does not exist" containerID="38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5" Apr 16 16:54:20.991677 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.991636 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5"} err="failed to get container status \"38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5\": rpc error: code = NotFound desc = could not find container \"38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5\": container with ID starting with 38762b08f55e6ab4e2f1aa4ba9a151aacad2b7426cdf78cd4b37bc45f404bdb5 not found: ID does not exist" Apr 16 16:54:20.999138 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:20.999117 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn"] Apr 16 16:54:21.002900 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:21.002881 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bab33-predictor-86bd4f56b4-97wxn"] Apr 16 16:54:22.853439 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:22.853404 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" path="/var/lib/kubelet/pods/7004b4b1-b124-47f6-9777-6d7ea8068d1e/volumes" Apr 16 16:54:23.848693 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:23.848663 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" Apr 16 16:54:28.978673 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:28.978632 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 16:54:38.978632 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:38.978584 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 16:54:48.978272 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:48.978227 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 16:54:52.020490 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.020455 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv"] Apr 16 16:54:52.020969 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.020674 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" containerID="cri-o://120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d" gracePeriod=30 Apr 16 16:54:52.098940 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.098893 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"] Apr 16 16:54:52.099256 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.099240 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" Apr 16 16:54:52.099299 ip-10-0-128-64 kubenswrapper[2578]: 
I0416 16:54:52.099258 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" Apr 16 16:54:52.099337 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.099317 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7004b4b1-b124-47f6-9777-6d7ea8068d1e" containerName="kserve-container" Apr 16 16:54:52.102405 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.102386 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" Apr 16 16:54:52.112573 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.112547 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"] Apr 16 16:54:52.112917 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.112903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" Apr 16 16:54:52.237157 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:52.237118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"] Apr 16 16:54:52.240636 ip-10-0-128-64 kubenswrapper[2578]: W0416 16:54:52.240602 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e5e949e_dee6_4211_b47b_6d48f6d2876b.slice/crio-223062b6b35187fbf13b231f86da17c5b5b2194687eeb53bd07f3e26eb8b4ca2 WatchSource:0}: Error finding container 223062b6b35187fbf13b231f86da17c5b5b2194687eeb53bd07f3e26eb8b4ca2: Status 404 returned error can't find the container with id 223062b6b35187fbf13b231f86da17c5b5b2194687eeb53bd07f3e26eb8b4ca2 Apr 16 16:54:53.075206 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:53.075167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" event={"ID":"0e5e949e-dee6-4211-b47b-6d48f6d2876b","Type":"ContainerStarted","Data":"f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6"} Apr 16 16:54:53.075206 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:53.075206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" event={"ID":"0e5e949e-dee6-4211-b47b-6d48f6d2876b","Type":"ContainerStarted","Data":"223062b6b35187fbf13b231f86da17c5b5b2194687eeb53bd07f3e26eb8b4ca2"} Apr 16 16:54:53.075643 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:53.075393 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" Apr 16 16:54:53.076591 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:53.076568 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 16:54:53.097694 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:53.097644 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podStartSLOduration=1.09762759 podStartE2EDuration="1.09762759s" podCreationTimestamp="2026-04-16 16:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:54:53.096099912 +0000 UTC m=+1848.818644302" watchObservedRunningTime="2026-04-16 16:54:53.09762759 +0000 UTC m=+1848.820171979" Apr 16 16:54:53.847887 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:53.847843 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 16:54:54.078194 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:54.078157 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 16:54:55.258885 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:55.258860 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" Apr 16 16:54:56.084266 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.084225 2578 generic.go:358] "Generic (PLEG): container finished" podID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerID="120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d" exitCode=0 Apr 16 16:54:56.084434 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.084280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" event={"ID":"3fb00e18-8ad3-425d-bbf2-46722da02dd7","Type":"ContainerDied","Data":"120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d"} Apr 16 16:54:56.084434 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.084286 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" Apr 16 16:54:56.084434 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.084309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv" event={"ID":"3fb00e18-8ad3-425d-bbf2-46722da02dd7","Type":"ContainerDied","Data":"e1cca482822d1211189024dc54c153860a5ce0f652f66bd8a77de097c01a206a"} Apr 16 16:54:56.084434 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.084329 2578 scope.go:117] "RemoveContainer" containerID="120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d" Apr 16 16:54:56.092653 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.092633 2578 scope.go:117] "RemoveContainer" containerID="120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d" Apr 16 16:54:56.092958 ip-10-0-128-64 kubenswrapper[2578]: E0416 16:54:56.092932 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d\": container with ID starting with 120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d not found: ID does not exist" containerID="120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d" Apr 16 16:54:56.093056 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.092963 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d"} err="failed to get container status \"120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d\": rpc error: code = NotFound desc = could not find container \"120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d\": container with ID starting with 120df0026e61360ee7d0e58d3ee07f9e29053ac8bd7d98e78d3377e49a38f93d not found: ID does not exist" Apr 16 16:54:56.105019 ip-10-0-128-64 
kubenswrapper[2578]: I0416 16:54:56.104993 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv"] Apr 16 16:54:56.108259 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.108238 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04ea1-predictor-69944bdbf6-kr8wv"] Apr 16 16:54:56.854072 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:56.854031 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" path="/var/lib/kubelet/pods/3fb00e18-8ad3-425d-bbf2-46722da02dd7/volumes" Apr 16 16:54:58.978446 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:54:58.978395 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 16:55:04.078527 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:55:04.078473 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 16:55:08.979826 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:55:08.979791 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" Apr 16 16:55:14.079131 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:55:14.079082 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection 
refused" Apr 16 16:55:24.079290 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:55:24.079204 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 16:55:34.078360 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:55:34.078313 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 16:55:44.080018 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:55:44.079985 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" Apr 16 16:59:04.861595 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:59:04.861480 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 16:59:04.865763 ip-10-0-128-64 kubenswrapper[2578]: I0416 16:59:04.865745 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log" Apr 16 17:03:42.033086 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.033046 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"] Apr 16 17:03:42.033635 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.033301 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" 
containerName="kserve-container" containerID="cri-o://f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3" gracePeriod=30 Apr 16 17:03:42.133675 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.133635 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"] Apr 16 17:03:42.134213 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.134193 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" Apr 16 17:03:42.134291 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.134216 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" Apr 16 17:03:42.134291 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.134286 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fb00e18-8ad3-425d-bbf2-46722da02dd7" containerName="kserve-container" Apr 16 17:03:42.137340 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.137315 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" Apr 16 17:03:42.145982 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.145956 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"] Apr 16 17:03:42.149038 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.149016 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"
Apr 16 17:03:42.270953 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.270902 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"]
Apr 16 17:03:42.273821 ip-10-0-128-64 kubenswrapper[2578]: W0416 17:03:42.273789 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ba8407_fa0f_4b26_9596_912c8f91a37b.slice/crio-6b0bd8b11200408f8007b4c0be02e7e31c2f9377c9dcac4c770213b5b0b6ad46 WatchSource:0}: Error finding container 6b0bd8b11200408f8007b4c0be02e7e31c2f9377c9dcac4c770213b5b0b6ad46: Status 404 returned error can't find the container with id 6b0bd8b11200408f8007b4c0be02e7e31c2f9377c9dcac4c770213b5b0b6ad46
Apr 16 17:03:42.275598 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.275582 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:03:42.547533 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.547450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" event={"ID":"e6ba8407-fa0f-4b26-9596-912c8f91a37b","Type":"ContainerStarted","Data":"8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c"}
Apr 16 17:03:42.547533 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.547485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" event={"ID":"e6ba8407-fa0f-4b26-9596-912c8f91a37b","Type":"ContainerStarted","Data":"6b0bd8b11200408f8007b4c0be02e7e31c2f9377c9dcac4c770213b5b0b6ad46"}
Apr 16 17:03:42.547712 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.547663 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"
Apr 16 17:03:42.549006 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.548980 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:03:42.564228 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:42.564175 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podStartSLOduration=0.564160647 podStartE2EDuration="564.160647ms" podCreationTimestamp="2026-04-16 17:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:03:42.563468608 +0000 UTC m=+2378.286013014" watchObservedRunningTime="2026-04-16 17:03:42.564160647 +0000 UTC m=+2378.286705036"
Apr 16 17:03:43.550704 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:43.550668 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:03:45.176870 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.176847 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"
Apr 16 17:03:45.558068 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.558038 2578 generic.go:358] "Generic (PLEG): container finished" podID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerID="f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3" exitCode=0
Apr 16 17:03:45.558241 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.558098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" event={"ID":"e8ce8d3e-23ce-46ec-983b-495828ebd970","Type":"ContainerDied","Data":"f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3"}
Apr 16 17:03:45.558241 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.558102 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"
Apr 16 17:03:45.558241 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.558124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8" event={"ID":"e8ce8d3e-23ce-46ec-983b-495828ebd970","Type":"ContainerDied","Data":"202e66e28b4a39898dcb95b56e47009471562c028d126bfbbc9f80611824b550"}
Apr 16 17:03:45.558241 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.558139 2578 scope.go:117] "RemoveContainer" containerID="f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3"
Apr 16 17:03:45.566067 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.566052 2578 scope.go:117] "RemoveContainer" containerID="f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3"
Apr 16 17:03:45.566351 ip-10-0-128-64 kubenswrapper[2578]: E0416 17:03:45.566330 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3\": container with ID starting with f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3 not found: ID does not exist" containerID="f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3"
Apr 16 17:03:45.566421 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.566358 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3"} err="failed to get container status \"f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3\": rpc error: code = NotFound desc = could not find container \"f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3\": container with ID starting with f0e0d868df6186f61f9baf0c4cea99edd4e6b43d3d6332fd1caef6fe537048a3 not found: ID does not exist"
Apr 16 17:03:45.577131 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.577103 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"]
Apr 16 17:03:45.581083 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:45.581059 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-da571-predictor-586f5c8454-pfzr8"]
Apr 16 17:03:46.854049 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:46.854017 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" path="/var/lib/kubelet/pods/e8ce8d3e-23ce-46ec-983b-495828ebd970/volumes"
Apr 16 17:03:53.550850 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:03:53.550808 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:04:03.551706 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:03.551660 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:04:04.880808 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:04.880698 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:04:04.886326 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:04.885467 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:04:13.551503 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:13.551459 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:04:16.964021 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.963992 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"]
Apr 16 17:04:16.964431 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.964215 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container" containerID="cri-o://f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6" gracePeriod=30
Apr 16 17:04:16.993916 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.993885 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"]
Apr 16 17:04:16.994207 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.994194 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container"
Apr 16 17:04:16.994261 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.994208 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container"
Apr 16 17:04:16.994297 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.994260 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8ce8d3e-23ce-46ec-983b-495828ebd970" containerName="kserve-container"
Apr 16 17:04:16.997289 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:16.997273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"
Apr 16 17:04:17.004219 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.004195 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"]
Apr 16 17:04:17.007829 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.007807 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"
Apr 16 17:04:17.129563 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.129461 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"]
Apr 16 17:04:17.132153 ip-10-0-128-64 kubenswrapper[2578]: W0416 17:04:17.132123 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd3d6ac_f4fe_4a34_9eef_e99722e2e59b.slice/crio-cd099bc3056102982112c7674713ac29e3aa2d159d77449b435392c1afe34a0e WatchSource:0}: Error finding container cd099bc3056102982112c7674713ac29e3aa2d159d77449b435392c1afe34a0e: Status 404 returned error can't find the container with id cd099bc3056102982112c7674713ac29e3aa2d159d77449b435392c1afe34a0e
Apr 16 17:04:17.656421 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.656390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" event={"ID":"0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b","Type":"ContainerStarted","Data":"c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5"}
Apr 16 17:04:17.656421 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.656426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" event={"ID":"0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b","Type":"ContainerStarted","Data":"cd099bc3056102982112c7674713ac29e3aa2d159d77449b435392c1afe34a0e"}
Apr 16 17:04:17.656659 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.656600 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"
Apr 16 17:04:17.657813 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.657786 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 16 17:04:17.676446 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:17.673693 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podStartSLOduration=1.6736759270000001 podStartE2EDuration="1.673675927s" podCreationTimestamp="2026-04-16 17:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:04:17.670715353 +0000 UTC m=+2413.393259754" watchObservedRunningTime="2026-04-16 17:04:17.673675927 +0000 UTC m=+2413.396220319"
Apr 16 17:04:18.659892 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:18.659852 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 16 17:04:20.103347 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.103327 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"
Apr 16 17:04:20.666120 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.666088 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerID="f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6" exitCode=0
Apr 16 17:04:20.666326 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.666143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" event={"ID":"0e5e949e-dee6-4211-b47b-6d48f6d2876b","Type":"ContainerDied","Data":"f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6"}
Apr 16 17:04:20.666326 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.666161 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"
Apr 16 17:04:20.666326 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.666178 2578 scope.go:117] "RemoveContainer" containerID="f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6"
Apr 16 17:04:20.666326 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.666168 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm" event={"ID":"0e5e949e-dee6-4211-b47b-6d48f6d2876b","Type":"ContainerDied","Data":"223062b6b35187fbf13b231f86da17c5b5b2194687eeb53bd07f3e26eb8b4ca2"}
Apr 16 17:04:20.674629 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.674613 2578 scope.go:117] "RemoveContainer" containerID="f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6"
Apr 16 17:04:20.674890 ip-10-0-128-64 kubenswrapper[2578]: E0416 17:04:20.674872 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6\": container with ID starting with f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6 not found: ID does not exist" containerID="f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6"
Apr 16 17:04:20.674973 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.674899 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6"} err="failed to get container status \"f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6\": rpc error: code = NotFound desc = could not find container \"f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6\": container with ID starting with f07be751c06b95b1d46164cba2218cd5d4201247630fa539e3de69a9c7730ac6 not found: ID does not exist"
Apr 16 17:04:20.685557 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.685536 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"]
Apr 16 17:04:20.688774 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.688748 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-64f8d-predictor-594459c94b-lb4vm"]
Apr 16 17:04:20.854662 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:20.854627 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" path="/var/lib/kubelet/pods/0e5e949e-dee6-4211-b47b-6d48f6d2876b/volumes"
Apr 16 17:04:23.551816 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:23.551777 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:04:28.660068 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:28.660028 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 16 17:04:33.552062 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:33.552029 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"
Apr 16 17:04:38.660308 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:38.660260 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 16 17:04:48.660154 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:48.660112 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 16 17:04:58.660598 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:04:58.660555 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 16 17:05:02.384128 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.384097 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"]
Apr 16 17:05:02.384557 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.384337 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" containerID="cri-o://8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c" gracePeriod=30
Apr 16 17:05:02.418261 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.418233 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"]
Apr 16 17:05:02.418529 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.418517 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container"
Apr 16 17:05:02.418529 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.418530 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container"
Apr 16 17:05:02.418608 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.418595 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e5e949e-dee6-4211-b47b-6d48f6d2876b" containerName="kserve-container"
Apr 16 17:05:02.421397 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.421381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"
Apr 16 17:05:02.429828 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.429805 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"]
Apr 16 17:05:02.430956 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.430937 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"
Apr 16 17:05:02.557356 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.557327 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"]
Apr 16 17:05:02.559832 ip-10-0-128-64 kubenswrapper[2578]: W0416 17:05:02.559805 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac6f9720_1c2e_45a3_9966_cfec5f0c30ca.slice/crio-05cc6b48c91c53c14f079930a132763b34e309e4666228a111c0892f1d92a6c5 WatchSource:0}: Error finding container 05cc6b48c91c53c14f079930a132763b34e309e4666228a111c0892f1d92a6c5: Status 404 returned error can't find the container with id 05cc6b48c91c53c14f079930a132763b34e309e4666228a111c0892f1d92a6c5
Apr 16 17:05:02.784636 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.784598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" event={"ID":"ac6f9720-1c2e-45a3-9966-cfec5f0c30ca","Type":"ContainerStarted","Data":"2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83"}
Apr 16 17:05:02.784636 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.784639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" event={"ID":"ac6f9720-1c2e-45a3-9966-cfec5f0c30ca","Type":"ContainerStarted","Data":"05cc6b48c91c53c14f079930a132763b34e309e4666228a111c0892f1d92a6c5"}
Apr 16 17:05:02.784931 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.784885 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"
Apr 16 17:05:02.786069 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.786040 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 17:05:02.799972 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:02.799904 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podStartSLOduration=0.799890925 podStartE2EDuration="799.890925ms" podCreationTimestamp="2026-04-16 17:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:05:02.798899652 +0000 UTC m=+2458.521444042" watchObservedRunningTime="2026-04-16 17:05:02.799890925 +0000 UTC m=+2458.522435314"
Apr 16 17:05:03.551729 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:03.551685 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 17:05:03.788480 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:03.788446 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 17:05:05.324335 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.324315 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"
Apr 16 17:05:05.797235 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.797201 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerID="8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c" exitCode=0
Apr 16 17:05:05.797403 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.797258 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"
Apr 16 17:05:05.797403 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.797281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" event={"ID":"e6ba8407-fa0f-4b26-9596-912c8f91a37b","Type":"ContainerDied","Data":"8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c"}
Apr 16 17:05:05.797403 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.797318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4" event={"ID":"e6ba8407-fa0f-4b26-9596-912c8f91a37b","Type":"ContainerDied","Data":"6b0bd8b11200408f8007b4c0be02e7e31c2f9377c9dcac4c770213b5b0b6ad46"}
Apr 16 17:05:05.797403 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.797339 2578 scope.go:117] "RemoveContainer" containerID="8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c"
Apr 16 17:05:05.805536 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.805519 2578 scope.go:117] "RemoveContainer" containerID="8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c"
Apr 16 17:05:05.805762 ip-10-0-128-64 kubenswrapper[2578]: E0416 17:05:05.805744 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c\": container with ID starting with 8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c not found: ID does not exist" containerID="8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c"
Apr 16 17:05:05.805806 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.805770 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c"} err="failed to get container status \"8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c\": rpc error: code = NotFound desc = could not find container \"8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c\": container with ID starting with 8d77679bb36e8de1248b2766cb96d5383f84f57a8ed7f1cdcddae727f1c9a37c not found: ID does not exist"
Apr 16 17:05:05.817351 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.817330 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"]
Apr 16 17:05:05.821025 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:05.821006 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df533-predictor-84d8c6bb56-gqkw4"]
Apr 16 17:05:06.853585 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:06.853550 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" path="/var/lib/kubelet/pods/e6ba8407-fa0f-4b26-9596-912c8f91a37b/volumes"
Apr 16 17:05:08.661628 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:08.661594 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"
Apr 16 17:05:13.788817 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:13.788777 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 17:05:23.788661 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:23.788615 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 17:05:33.789083 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:33.789043 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 17:05:43.789036 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:43.788993 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 17:05:53.790163 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:05:53.790085 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"
Apr 16 17:09:04.899419 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:09:04.899297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:09:04.904898 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:09:04.904878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:14:04.918460 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:04.918340 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:14:04.925086 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:04.925068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:14:27.127225 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:27.127193 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"]
Apr 16 17:14:27.127723 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:27.127433 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container" containerID="cri-o://2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83" gracePeriod=30
Apr 16 17:14:30.167842 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.167821 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"
Apr 16 17:14:30.368168 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.368134 2578 generic.go:358] "Generic (PLEG): container finished" podID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerID="2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83" exitCode=0
Apr 16 17:14:30.368331 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.368205 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"
Apr 16 17:14:30.368331 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.368199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" event={"ID":"ac6f9720-1c2e-45a3-9966-cfec5f0c30ca","Type":"ContainerDied","Data":"2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83"}
Apr 16 17:14:30.368331 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.368241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w" event={"ID":"ac6f9720-1c2e-45a3-9966-cfec5f0c30ca","Type":"ContainerDied","Data":"05cc6b48c91c53c14f079930a132763b34e309e4666228a111c0892f1d92a6c5"}
Apr 16 17:14:30.368331 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.368256 2578 scope.go:117] "RemoveContainer" containerID="2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83"
Apr 16 17:14:30.375861 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.375844 2578 scope.go:117] "RemoveContainer" containerID="2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83"
Apr 16 17:14:30.376180 ip-10-0-128-64 kubenswrapper[2578]: E0416 17:14:30.376153 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83\": container with ID starting with 2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83 not found: ID does not exist" containerID="2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83"
Apr 16 17:14:30.376270 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.376187 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83"} err="failed to get container status \"2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83\": rpc error: code = NotFound desc = could not find container \"2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83\": container with ID starting with 2f46dc812497cef33d64cfa071402796b645f7ec4742a6a097d6429a72e8bd83 not found: ID does not exist"
Apr 16 17:14:30.387363 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.387340 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"]
Apr 16 17:14:30.389556 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.389537 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-45e11-predictor-5c496f76fd-sc45w"]
Apr 16 17:14:30.853935 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:14:30.853891 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" path="/var/lib/kubelet/pods/ac6f9720-1c2e-45a3-9966-cfec5f0c30ca/volumes"
Apr 16 17:19:04.942448 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:19:04.942339 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:19:04.950327 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:19:04.950308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:21:46.684884 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:46.684848 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"]
Apr 16 17:21:46.685403 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:46.685110 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" containerID="cri-o://c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5" gracePeriod=30
Apr 16 17:21:48.075753 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.075719 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rxh8x/must-gather-brlsv"]
Apr 16 17:21:48.076237 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.076196 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container"
Apr 16 17:21:48.076237 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.076214 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container"
Apr 16 17:21:48.076237 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.076232 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container"
Apr 16 17:21:48.076393 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.076241 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container"
Apr 16 17:21:48.076393 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.076317 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6ba8407-fa0f-4b26-9596-912c8f91a37b" containerName="kserve-container"
Apr 16 17:21:48.076393 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.076334 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac6f9720-1c2e-45a3-9966-cfec5f0c30ca" containerName="kserve-container"
Apr 16 17:21:48.079350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.079330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxh8x/must-gather-brlsv"
Apr 16 17:21:48.081804 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.081787 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rxh8x\"/\"kube-root-ca.crt\""
Apr 16 17:21:48.082534 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.082513 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rxh8x\"/\"openshift-service-ca.crt\""
Apr 16 17:21:48.082642 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.082547 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rxh8x\"/\"default-dockercfg-crdg6\""
Apr 16 17:21:48.098314 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.098290 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rxh8x/must-gather-brlsv"]
Apr 16 17:21:48.167840 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.167802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rtv\" (UniqueName: \"kubernetes.io/projected/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-kube-api-access-x4rtv\") pod \"must-gather-brlsv\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " pod="openshift-must-gather-rxh8x/must-gather-brlsv"
Apr 16 17:21:48.168043 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.167958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-must-gather-output\") pod \"must-gather-brlsv\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " pod="openshift-must-gather-rxh8x/must-gather-brlsv"
Apr 16 17:21:48.268963 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.268908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName:
\"kubernetes.io/empty-dir/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-must-gather-output\") pod \"must-gather-brlsv\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:21:48.268963 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.268969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rtv\" (UniqueName: \"kubernetes.io/projected/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-kube-api-access-x4rtv\") pod \"must-gather-brlsv\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:21:48.269280 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.269260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-must-gather-output\") pod \"must-gather-brlsv\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:21:48.276663 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.276630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rtv\" (UniqueName: \"kubernetes.io/projected/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-kube-api-access-x4rtv\") pod \"must-gather-brlsv\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:21:48.405308 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.405230 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:21:48.524293 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.524263 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rxh8x/must-gather-brlsv"] Apr 16 17:21:48.526848 ip-10-0-128-64 kubenswrapper[2578]: W0416 17:21:48.526825 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod262008ee_5ec4_49bb_9765_e16fbfe0fbb4.slice/crio-e35a0f25b65d753bc0b240b45d6d5f12237736ed4012c1d1e0dfa77e616dfa83 WatchSource:0}: Error finding container e35a0f25b65d753bc0b240b45d6d5f12237736ed4012c1d1e0dfa77e616dfa83: Status 404 returned error can't find the container with id e35a0f25b65d753bc0b240b45d6d5f12237736ed4012c1d1e0dfa77e616dfa83 Apr 16 17:21:48.528821 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.528806 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:21:48.603554 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.603522 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh8x/must-gather-brlsv" event={"ID":"262008ee-5ec4-49bb-9765-e16fbfe0fbb4","Type":"ContainerStarted","Data":"e35a0f25b65d753bc0b240b45d6d5f12237736ed4012c1d1e0dfa77e616dfa83"} Apr 16 17:21:48.660206 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:48.660130 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 17:21:49.948218 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:49.948199 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" Apr 16 17:21:50.610843 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.610804 2578 generic.go:358] "Generic (PLEG): container finished" podID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerID="c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5" exitCode=0 Apr 16 17:21:50.611050 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.610868 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" Apr 16 17:21:50.611050 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.610901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" event={"ID":"0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b","Type":"ContainerDied","Data":"c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5"} Apr 16 17:21:50.611050 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.610958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h" event={"ID":"0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b","Type":"ContainerDied","Data":"cd099bc3056102982112c7674713ac29e3aa2d159d77449b435392c1afe34a0e"} Apr 16 17:21:50.611050 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.610980 2578 scope.go:117] "RemoveContainer" containerID="c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5" Apr 16 17:21:50.620144 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.620123 2578 scope.go:117] "RemoveContainer" containerID="c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5" Apr 16 17:21:50.620444 ip-10-0-128-64 kubenswrapper[2578]: E0416 17:21:50.620413 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5\": 
container with ID starting with c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5 not found: ID does not exist" containerID="c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5" Apr 16 17:21:50.620554 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.620451 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5"} err="failed to get container status \"c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5\": rpc error: code = NotFound desc = could not find container \"c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5\": container with ID starting with c2aca986020dfbcf62beab4f8824c32b323b7c03f8082b73e37a3fd3111ec4f5 not found: ID does not exist" Apr 16 17:21:50.633100 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.633071 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"] Apr 16 17:21:50.636561 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.636534 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9f91-predictor-cb47fbcb4-rhs7h"] Apr 16 17:21:50.853436 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:50.853406 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" path="/var/lib/kubelet/pods/0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b/volumes" Apr 16 17:21:53.622865 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:53.622835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh8x/must-gather-brlsv" event={"ID":"262008ee-5ec4-49bb-9765-e16fbfe0fbb4","Type":"ContainerStarted","Data":"0564ac7fc2d674ae7775252d93712a4b6ac7aadc4fb529afcbc1c615d9acc187"} Apr 16 17:21:53.622865 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:53.622870 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-rxh8x/must-gather-brlsv" event={"ID":"262008ee-5ec4-49bb-9765-e16fbfe0fbb4","Type":"ContainerStarted","Data":"1e4575ad03465aa0c58c6269e583258e996d8ce70abdffc6b32f0fc41db3828f"} Apr 16 17:21:53.651467 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:21:53.651420 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rxh8x/must-gather-brlsv" podStartSLOduration=0.952709264 podStartE2EDuration="5.651404881s" podCreationTimestamp="2026-04-16 17:21:48 +0000 UTC" firstStartedPulling="2026-04-16 17:21:48.528957932 +0000 UTC m=+3464.251502300" lastFinishedPulling="2026-04-16 17:21:53.227653539 +0000 UTC m=+3468.950197917" observedRunningTime="2026-04-16 17:21:53.649184393 +0000 UTC m=+3469.371728782" watchObservedRunningTime="2026-04-16 17:21:53.651404881 +0000 UTC m=+3469.373949318" Apr 16 17:22:11.679647 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:11.679612 2578 generic.go:358] "Generic (PLEG): container finished" podID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerID="1e4575ad03465aa0c58c6269e583258e996d8ce70abdffc6b32f0fc41db3828f" exitCode=0 Apr 16 17:22:11.680071 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:11.679684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxh8x/must-gather-brlsv" event={"ID":"262008ee-5ec4-49bb-9765-e16fbfe0fbb4","Type":"ContainerDied","Data":"1e4575ad03465aa0c58c6269e583258e996d8ce70abdffc6b32f0fc41db3828f"} Apr 16 17:22:11.680071 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:11.680062 2578 scope.go:117] "RemoveContainer" containerID="1e4575ad03465aa0c58c6269e583258e996d8ce70abdffc6b32f0fc41db3828f" Apr 16 17:22:11.990593 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:11.990514 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh8x_must-gather-brlsv_262008ee-5ec4-49bb-9765-e16fbfe0fbb4/gather/0.log" Apr 16 17:22:12.565100 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.565063 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-65l7w/must-gather-7jt7b"] Apr 16 17:22:12.565366 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.565354 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" Apr 16 17:22:12.565409 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.565367 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" Apr 16 17:22:12.565443 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.565419 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bd3d6ac-f4fe-4a34-9eef-e99722e2e59b" containerName="kserve-container" Apr 16 17:22:12.568086 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.568071 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.570633 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.570604 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-65l7w\"/\"kube-root-ca.crt\"" Apr 16 17:22:12.570633 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.570608 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-65l7w\"/\"openshift-service-ca.crt\"" Apr 16 17:22:12.571327 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.571309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-65l7w\"/\"default-dockercfg-srjc9\"" Apr 16 17:22:12.575589 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.575563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-65l7w/must-gather-7jt7b"] Apr 16 17:22:12.674970 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.674894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc2bf135-f3c3-4f03-af09-8949a21ed2bb-must-gather-output\") pod \"must-gather-7jt7b\" (UID: \"fc2bf135-f3c3-4f03-af09-8949a21ed2bb\") " pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.675166 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.675034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhlpl\" (UniqueName: \"kubernetes.io/projected/fc2bf135-f3c3-4f03-af09-8949a21ed2bb-kube-api-access-xhlpl\") pod \"must-gather-7jt7b\" (UID: \"fc2bf135-f3c3-4f03-af09-8949a21ed2bb\") " pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.776393 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.776354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc2bf135-f3c3-4f03-af09-8949a21ed2bb-must-gather-output\") pod \"must-gather-7jt7b\" (UID: \"fc2bf135-f3c3-4f03-af09-8949a21ed2bb\") " pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.776772 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.776411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhlpl\" (UniqueName: \"kubernetes.io/projected/fc2bf135-f3c3-4f03-af09-8949a21ed2bb-kube-api-access-xhlpl\") pod \"must-gather-7jt7b\" (UID: \"fc2bf135-f3c3-4f03-af09-8949a21ed2bb\") " pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.776772 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.776671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fc2bf135-f3c3-4f03-af09-8949a21ed2bb-must-gather-output\") pod \"must-gather-7jt7b\" (UID: \"fc2bf135-f3c3-4f03-af09-8949a21ed2bb\") " pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.784118 ip-10-0-128-64 kubenswrapper[2578]: I0416 
17:22:12.784092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhlpl\" (UniqueName: \"kubernetes.io/projected/fc2bf135-f3c3-4f03-af09-8949a21ed2bb-kube-api-access-xhlpl\") pod \"must-gather-7jt7b\" (UID: \"fc2bf135-f3c3-4f03-af09-8949a21ed2bb\") " pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.876738 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.876655 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-65l7w/must-gather-7jt7b" Apr 16 17:22:12.994178 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:12.994156 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-65l7w/must-gather-7jt7b"] Apr 16 17:22:12.996567 ip-10-0-128-64 kubenswrapper[2578]: W0416 17:22:12.996529 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc2bf135_f3c3_4f03_af09_8949a21ed2bb.slice/crio-254e99be06d3e3ecafc784d16472b1c49c3574110ce1540d247f5afecdd99ccf WatchSource:0}: Error finding container 254e99be06d3e3ecafc784d16472b1c49c3574110ce1540d247f5afecdd99ccf: Status 404 returned error can't find the container with id 254e99be06d3e3ecafc784d16472b1c49c3574110ce1540d247f5afecdd99ccf Apr 16 17:22:13.686228 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:13.686197 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/must-gather-7jt7b" event={"ID":"fc2bf135-f3c3-4f03-af09-8949a21ed2bb","Type":"ContainerStarted","Data":"254e99be06d3e3ecafc784d16472b1c49c3574110ce1540d247f5afecdd99ccf"} Apr 16 17:22:14.691479 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:14.691441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/must-gather-7jt7b" event={"ID":"fc2bf135-f3c3-4f03-af09-8949a21ed2bb","Type":"ContainerStarted","Data":"a87e4e3a78b4abeb4d0a3bcb5effa04caf4289033ebcab10dc6e63dbc5d8f858"} Apr 16 
17:22:14.691479 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:14.691485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/must-gather-7jt7b" event={"ID":"fc2bf135-f3c3-4f03-af09-8949a21ed2bb","Type":"ContainerStarted","Data":"85d577c8fa18cb3791bcf31732ac1f309b1c4ff769dbb34cd89699b858f5fe8d"} Apr 16 17:22:14.708893 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:14.708829 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-65l7w/must-gather-7jt7b" podStartSLOduration=1.85010902 podStartE2EDuration="2.708807955s" podCreationTimestamp="2026-04-16 17:22:12 +0000 UTC" firstStartedPulling="2026-04-16 17:22:12.998356803 +0000 UTC m=+3488.720901177" lastFinishedPulling="2026-04-16 17:22:13.857055737 +0000 UTC m=+3489.579600112" observedRunningTime="2026-04-16 17:22:14.706713237 +0000 UTC m=+3490.429257627" watchObservedRunningTime="2026-04-16 17:22:14.708807955 +0000 UTC m=+3490.431352346" Apr 16 17:22:15.206245 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:15.206206 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cb8q7_f5a24ad4-a379-47f2-bc15-10eccd9d9898/global-pull-secret-syncer/0.log" Apr 16 17:22:15.420154 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:15.420123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-x7ld8_9d333c14-36c2-41c1-af1c-1c7a8daaee58/konnectivity-agent/0.log" Apr 16 17:22:15.465845 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:15.465765 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-64.ec2.internal_b52ab70ac24bb6b9ef3948f422fe6d61/haproxy/0.log" Apr 16 17:22:17.401727 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.401687 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rxh8x/must-gather-brlsv"] Apr 16 17:22:17.402226 ip-10-0-128-64 kubenswrapper[2578]: I0416 
17:22:17.401978 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rxh8x/must-gather-brlsv" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="copy" containerID="cri-o://0564ac7fc2d674ae7775252d93712a4b6ac7aadc4fb529afcbc1c615d9acc187" gracePeriod=2 Apr 16 17:22:17.406769 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.406736 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rxh8x/must-gather-brlsv"] Apr 16 17:22:17.720784 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.709361 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh8x_must-gather-brlsv_262008ee-5ec4-49bb-9765-e16fbfe0fbb4/copy/0.log" Apr 16 17:22:17.720784 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.709717 2578 generic.go:358] "Generic (PLEG): container finished" podID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerID="0564ac7fc2d674ae7775252d93712a4b6ac7aadc4fb529afcbc1c615d9acc187" exitCode=143 Apr 16 17:22:17.751337 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.750629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh8x_must-gather-brlsv_262008ee-5ec4-49bb-9765-e16fbfe0fbb4/copy/0.log" Apr 16 17:22:17.751337 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.751050 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:22:17.757242 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.757185 2578 status_manager.go:895] "Failed to get status for pod" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" pod="openshift-must-gather-rxh8x/must-gather-brlsv" err="pods \"must-gather-brlsv\" is forbidden: User \"system:node:ip-10-0-128-64.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rxh8x\": no relationship found between node 'ip-10-0-128-64.ec2.internal' and this object" Apr 16 17:22:17.822460 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.821792 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-must-gather-output\") pod \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " Apr 16 17:22:17.822460 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.821851 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4rtv\" (UniqueName: \"kubernetes.io/projected/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-kube-api-access-x4rtv\") pod \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\" (UID: \"262008ee-5ec4-49bb-9765-e16fbfe0fbb4\") " Apr 16 17:22:17.825264 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.824149 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "262008ee-5ec4-49bb-9765-e16fbfe0fbb4" (UID: "262008ee-5ec4-49bb-9765-e16fbfe0fbb4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:22:17.832479 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.832431 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-kube-api-access-x4rtv" (OuterVolumeSpecName: "kube-api-access-x4rtv") pod "262008ee-5ec4-49bb-9765-e16fbfe0fbb4" (UID: "262008ee-5ec4-49bb-9765-e16fbfe0fbb4"). InnerVolumeSpecName "kube-api-access-x4rtv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:22:17.923220 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.923147 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-must-gather-output\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 17:22:17.923220 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:17.923182 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4rtv\" (UniqueName: \"kubernetes.io/projected/262008ee-5ec4-49bb-9765-e16fbfe0fbb4-kube-api-access-x4rtv\") on node \"ip-10-0-128-64.ec2.internal\" DevicePath \"\"" Apr 16 17:22:18.715098 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.715071 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxh8x_must-gather-brlsv_262008ee-5ec4-49bb-9765-e16fbfe0fbb4/copy/0.log" Apr 16 17:22:18.716139 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.716115 2578 scope.go:117] "RemoveContainer" containerID="0564ac7fc2d674ae7775252d93712a4b6ac7aadc4fb529afcbc1c615d9acc187" Apr 16 17:22:18.716755 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.716492 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxh8x/must-gather-brlsv" Apr 16 17:22:18.719607 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.719571 2578 status_manager.go:895] "Failed to get status for pod" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" pod="openshift-must-gather-rxh8x/must-gather-brlsv" err="pods \"must-gather-brlsv\" is forbidden: User \"system:node:ip-10-0-128-64.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rxh8x\": no relationship found between node 'ip-10-0-128-64.ec2.internal' and this object" Apr 16 17:22:18.738136 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.738110 2578 scope.go:117] "RemoveContainer" containerID="1e4575ad03465aa0c58c6269e583258e996d8ce70abdffc6b32f0fc41db3828f" Apr 16 17:22:18.754658 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.754615 2578 status_manager.go:895] "Failed to get status for pod" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" pod="openshift-must-gather-rxh8x/must-gather-brlsv" err="pods \"must-gather-brlsv\" is forbidden: User \"system:node:ip-10-0-128-64.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rxh8x\": no relationship found between node 'ip-10-0-128-64.ec2.internal' and this object" Apr 16 17:22:18.856873 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:18.856833 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" path="/var/lib/kubelet/pods/262008ee-5ec4-49bb-9765-e16fbfe0fbb4/volumes" Apr 16 17:22:19.272875 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.272815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-4rntv_6e8dbc9b-a73b-491e-802f-609a1250cd4b/kube-state-metrics/0.log" Apr 16 17:22:19.291635 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.291603 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-4rntv_6e8dbc9b-a73b-491e-802f-609a1250cd4b/kube-rbac-proxy-main/0.log" Apr 16 17:22:19.313811 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.313782 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-4rntv_6e8dbc9b-a73b-491e-802f-609a1250cd4b/kube-rbac-proxy-self/0.log" Apr 16 17:22:19.396583 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.396554 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jdflm_41d52522-7c10-4ddf-9497-d3fd714c18b9/node-exporter/0.log" Apr 16 17:22:19.420436 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.420409 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jdflm_41d52522-7c10-4ddf-9497-d3fd714c18b9/kube-rbac-proxy/0.log" Apr 16 17:22:19.441200 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.441175 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jdflm_41d52522-7c10-4ddf-9497-d3fd714c18b9/init-textfile/0.log" Apr 16 17:22:19.614729 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.614652 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-dzwlh_c66366bc-9d81-43f5-af40-4e9c08a337d5/kube-rbac-proxy-main/0.log" Apr 16 17:22:19.639577 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.639517 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-dzwlh_c66366bc-9d81-43f5-af40-4e9c08a337d5/kube-rbac-proxy-self/0.log" Apr 16 17:22:19.662128 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.662057 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-dzwlh_c66366bc-9d81-43f5-af40-4e9c08a337d5/openshift-state-metrics/0.log" Apr 16 17:22:19.706184 ip-10-0-128-64 
kubenswrapper[2578]: I0416 17:22:19.706047 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/prometheus/0.log"
Apr 16 17:22:19.726708 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.726684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/config-reloader/0.log"
Apr 16 17:22:19.747221 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.747191 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/thanos-sidecar/0.log"
Apr 16 17:22:19.767833 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.767809 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/kube-rbac-proxy-web/0.log"
Apr 16 17:22:19.787008 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.786984 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/kube-rbac-proxy/0.log"
Apr 16 17:22:19.805855 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.805821 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/kube-rbac-proxy-thanos/0.log"
Apr 16 17:22:19.829364 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.829335 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_884d6486-6a24-4277-850c-a3725856c08b/init-config-reloader/0.log"
Apr 16 17:22:19.857962 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.857915 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-nkz7c_91db00c7-daa7-456d-93c4-bda81def2d2d/prometheus-operator/0.log"
Apr 16 17:22:19.877873 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:19.877799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-nkz7c_91db00c7-daa7-456d-93c4-bda81def2d2d/kube-rbac-proxy/0.log"
Apr 16 17:22:22.066297 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.066245 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-ht495_774f54d2-48ef-4bb5-a2eb-225ce1d17845/download-server/0.log"
Apr 16 17:22:22.660075 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660043 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"]
Apr 16 17:22:22.660368 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660353 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="copy"
Apr 16 17:22:22.660368 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660368 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="copy"
Apr 16 17:22:22.660484 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660377 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="gather"
Apr 16 17:22:22.660484 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660383 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="gather"
Apr 16 17:22:22.660484 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660435 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="gather"
Apr 16 17:22:22.660484 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.660443 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="262008ee-5ec4-49bb-9765-e16fbfe0fbb4" containerName="copy"
Apr 16 17:22:22.663862 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.663801 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.670498 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.670452 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"]
Apr 16 17:22:22.764881 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.764850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspt7\" (UniqueName: \"kubernetes.io/projected/a380a43b-5300-4c78-9b34-f3d7e93afb43-kube-api-access-dspt7\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.765296 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.764897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-podres\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.765296 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.765025 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-proc\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.765296 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.765069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-lib-modules\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.765296 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.765125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-sys\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866110 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-proc\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866110 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-lib-modules\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-sys\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dspt7\" (UniqueName: \"kubernetes.io/projected/a380a43b-5300-4c78-9b34-f3d7e93afb43-kube-api-access-dspt7\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-podres\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-proc\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866244 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-sys\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-lib-modules\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.866350 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.866306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a380a43b-5300-4c78-9b34-f3d7e93afb43-podres\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.874501 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.874475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspt7\" (UniqueName: \"kubernetes.io/projected/a380a43b-5300-4c78-9b34-f3d7e93afb43-kube-api-access-dspt7\") pod \"perf-node-gather-daemonset-j6x2j\" (UID: \"a380a43b-5300-4c78-9b34-f3d7e93afb43\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:22.975979 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:22.975883 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:23.125976 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.125948 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"]
Apr 16 17:22:23.128583 ip-10-0-128-64 kubenswrapper[2578]: W0416 17:22:23.128556 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda380a43b_5300_4c78_9b34_f3d7e93afb43.slice/crio-8a1752e42b5869de5fae27443700f3ace13032902c73d3b6215d95b53cac8c02 WatchSource:0}: Error finding container 8a1752e42b5869de5fae27443700f3ace13032902c73d3b6215d95b53cac8c02: Status 404 returned error can't find the container with id 8a1752e42b5869de5fae27443700f3ace13032902c73d3b6215d95b53cac8c02
Apr 16 17:22:23.142696 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.142678 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7ngj9_fc46429e-6d00-4382-9497-f38ad024a4a7/dns/0.log"
Apr 16 17:22:23.161351 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.161330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7ngj9_fc46429e-6d00-4382-9497-f38ad024a4a7/kube-rbac-proxy/0.log"
Apr 16 17:22:23.226681 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.226614 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4stxv_e6722ce7-f829-490e-9ee8-97b3a979ca09/dns-node-resolver/0.log"
Apr 16 17:22:23.730699 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.730623 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cv6tw_8a2d315f-d3a3-4cab-97d9-becca3a12249/node-ca/0.log"
Apr 16 17:22:23.734911 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.734886 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j" event={"ID":"a380a43b-5300-4c78-9b34-f3d7e93afb43","Type":"ContainerStarted","Data":"bd572851eef421be34c06e68b7a00b21913a077db8a0e206a043f71a8926ba35"}
Apr 16 17:22:23.734911 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.734914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j" event={"ID":"a380a43b-5300-4c78-9b34-f3d7e93afb43","Type":"ContainerStarted","Data":"8a1752e42b5869de5fae27443700f3ace13032902c73d3b6215d95b53cac8c02"}
Apr 16 17:22:23.735138 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.735047 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:23.750605 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:23.750558 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j" podStartSLOduration=1.750542222 podStartE2EDuration="1.750542222s" podCreationTimestamp="2026-04-16 17:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:22:23.749543496 +0000 UTC m=+3499.472087883" watchObservedRunningTime="2026-04-16 17:22:23.750542222 +0000 UTC m=+3499.473086610"
Apr 16 17:22:24.749575 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:24.749549 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9gp2s_3221ff3b-b85e-4d3b-9c0f-30052ddc6800/serve-healthcheck-canary/0.log"
Apr 16 17:22:25.162489 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:25.162456 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c2j7c_024ed9a3-afbd-4d04-8b6a-d9546f86f606/kube-rbac-proxy/0.log"
Apr 16 17:22:25.183714 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:25.183689 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c2j7c_024ed9a3-afbd-4d04-8b6a-d9546f86f606/exporter/0.log"
Apr 16 17:22:25.206042 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:25.206015 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c2j7c_024ed9a3-afbd-4d04-8b6a-d9546f86f606/extractor/0.log"
Apr 16 17:22:27.297033 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:27.297007 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-jpt94_b4125926-3da6-4244-9477-f06e200b8975/server/0.log"
Apr 16 17:22:27.565189 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:27.565118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-kx9rj_bda92557-f9bd-421c-9ea3-ee47a8c11b2f/manager/0.log"
Apr 16 17:22:27.583750 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:27.583725 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-lks8k_70f5c9b7-0054-4a03-9d0e-4098e0c8ba03/s3-init/0.log"
Apr 16 17:22:27.617524 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:27.617494 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-vt4x6_44fe971d-bc11-4b96-b457-a2b8b58c0639/seaweedfs/0.log"
Apr 16 17:22:29.749661 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:29.749632 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-j6x2j"
Apr 16 17:22:32.916970 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:32.916946 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/kube-multus-additional-cni-plugins/0.log"
Apr 16 17:22:32.951282 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:32.951255 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/egress-router-binary-copy/0.log"
Apr 16 17:22:32.986305 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:32.986283 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/cni-plugins/0.log"
Apr 16 17:22:33.023932 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.023892 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/bond-cni-plugin/0.log"
Apr 16 17:22:33.061201 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.061176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/routeoverride-cni/0.log"
Apr 16 17:22:33.102300 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.102276 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/whereabouts-cni-bincopy/0.log"
Apr 16 17:22:33.135587 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.135560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wflp7_7f749d5c-04b9-4ecc-8853-c8deff057ad4/whereabouts-cni/0.log"
Apr 16 17:22:33.221914 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.221841 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lsqww_0cad9b82-f6b2-4330-bd44-9e2cbb2b5e59/kube-multus/0.log"
Apr 16 17:22:33.432502 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.432478 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-stfn4_0b0c36b6-3279-4629-991c-70026ff0d0b6/network-metrics-daemon/0.log"
Apr 16 17:22:33.471746 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:33.471714 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-stfn4_0b0c36b6-3279-4629-991c-70026ff0d0b6/kube-rbac-proxy/0.log"
Apr 16 17:22:34.929096 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:34.929064 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-controller/0.log"
Apr 16 17:22:34.946844 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:34.946821 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/0.log"
Apr 16 17:22:34.962369 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:34.962340 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovn-acl-logging/1.log"
Apr 16 17:22:34.978743 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:34.978717 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/kube-rbac-proxy-node/0.log"
Apr 16 17:22:34.997480 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:34.997451 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 17:22:35.018461 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:35.018432 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/northd/0.log"
Apr 16 17:22:35.039700 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:35.039593 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/nbdb/0.log"
Apr 16 17:22:35.058600 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:35.058562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/sbdb/0.log"
Apr 16 17:22:35.175937 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:35.175895 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dj5w9_3ff3f6de-097e-4812-8f8e-276c41254178/ovnkube-controller/0.log"
Apr 16 17:22:36.186314 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:36.186282 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-n26xz_eeb0bd2c-94b2-4f4f-a32a-8624aa8a7d0e/network-check-target-container/0.log"
Apr 16 17:22:37.099658 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:37.099631 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vjlfb_e746431a-e308-4ccf-87fa-0969f2b40152/iptables-alerter/0.log"
Apr 16 17:22:37.676631 ip-10-0-128-64 kubenswrapper[2578]: I0416 17:22:37.676600 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-57m87_e95b712d-7106-4990-96bd-48d8764b3a55/tuned/0.log"