Apr 16 13:56:35.679109 ip-10-0-130-195 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:35.679121 ip-10-0-130-195 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:35.679128 ip-10-0-130-195 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:35.679335 ip-10-0-130-195 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:45.856247 ip-10-0-130-195 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:45.856265 ip-10-0-130-195 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6470cc8d09714039a4fdb83ad346449a --
Apr 16 13:59:03.821970 ip-10-0-130-195 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:04.287338 ip-10-0-130-195 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:04.287338 ip-10-0-130-195 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:04.287338 ip-10-0-130-195 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:04.287338 ip-10-0-130-195 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:04.287338 ip-10-0-130-195 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:04.289815 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.289726 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:04.294512 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294494 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:04.294512 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294511 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:04.294512 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294515 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294518 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294521 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294524 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294528 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294531 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294533 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294536 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294539 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294541 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294544 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294547 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294556 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294559 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294562 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294565 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294567 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294570 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294573 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294575 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:04.294612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294578 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294581 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294584 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294586 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294589 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294592 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294594 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294597 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294599 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294602 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294605 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294608 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294610 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294615 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294619 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294622 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294626 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294630 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294632 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:04.295110 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294635 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294637 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294640 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294642 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294645 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294647 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294651 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294653 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294656 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294658 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294661 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294676 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294679 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294681 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294684 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294688 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294691 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294693 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294696 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294699 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:04.295599 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294702 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294704 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294707 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294710 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294713 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294715 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294718 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294721 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294723 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294726 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294730 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294734 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294736 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294739 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294742 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294746 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294749 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294751 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294754 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294757 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:04.296156 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294759 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294762 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294765 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294767 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.294770 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296576 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296586 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296590 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296593 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296595 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296598 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296601 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296603 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296606 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296608 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296611 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296613 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296616 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296618 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296621 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:04.296701 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296623 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296626 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296628 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296631 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296633 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296636 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296640 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296644 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296647 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296650 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296653 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296656 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296659 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296663 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296688 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296691 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296694 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296697 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296700 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296703 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:04.297174 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296705 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296708 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296711 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296714 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296717 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296720 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296722 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296725 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296728 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296730 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296733 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296738 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296741 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296744 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296747 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296750 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296752 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296755 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296757 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:04.297893 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296760 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296762 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296765 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296767 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296770 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296773 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296776 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296779 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296782 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296785 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296787 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296790 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296793 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296796 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296799 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296801 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296804 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296806 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296809 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296811 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:04.298574 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296814 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296816 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296819 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296821 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296824 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296826 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296829 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296831 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296834 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296836 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296839 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.296841 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296913 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296920 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296927 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296931 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296936 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296940 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296946 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296951 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296955 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:04.299107 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296958 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296961 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296965 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296968 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296971 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296974 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296977 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296980 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296983 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296986 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296990 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296992 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296996 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.296999 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297002 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297007 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297010 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297013 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297016 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297019 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297022 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297025 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297028 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297031 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297035 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:04.299628 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297038 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297044 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297047 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297050 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297053 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297059 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297062 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297066 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297069 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297072 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297076 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297079 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297082 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297085 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297088 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297091 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297094 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297097 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297100 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297103 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297106 2573 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297109
2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297112 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297115 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297118 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297121 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:04.300247 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297125 2573 flags.go:64] FLAG: --help="false" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297127 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297130 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297133 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297136 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297140 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297143 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297147 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297150 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:04.300903 
ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297153 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297157 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297160 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297164 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297167 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297170 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297173 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297176 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297179 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297182 2573 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297186 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297189 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297192 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297197 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:04.300903 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297201 2573 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297204 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297207 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297210 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297213 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297216 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297219 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297223 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297227 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297231 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297234 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297237 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297240 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297243 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297246 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:04.301451 
ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297249 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297253 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297261 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297264 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297268 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297271 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297274 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297280 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297283 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:04.301451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297286 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297290 2573 flags.go:64] FLAG: --port="10250" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297293 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297296 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a52990e952e1f4e5" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297299 2573 flags.go:64] FLAG: --qos-reserved="" Apr 
16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297302 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297306 2573 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297308 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297311 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297315 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297318 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297321 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297324 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297327 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297330 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297333 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297336 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297339 2573 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297342 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297345 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297348 2573 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297351 2573 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297354 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297357 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297362 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297365 2573 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 13:59:04.302078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297368 2573 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297371 2573 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297375 2573 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297378 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297381 2573 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297384 2573 flags.go:64] FLAG: --system-cgroups=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297387 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297393 2573 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297396 2573 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297399 2573 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297402 2573 flags.go:64] FLAG: --tls-min-version=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297405 2573 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297408 2573 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297411 2573 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297414 2573 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297417 2573 flags.go:64] FLAG: --v="2"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297421 2573 flags.go:64] FLAG: --version="false"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297425 2573 flags.go:64] FLAG: --vmodule=""
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297429 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.297432 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298721 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298736 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298740 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298743 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:04.302733 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298746 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298753 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298756 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298760 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298763 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298765 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298768 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298771 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298774 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298777 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298780 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298783 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298786 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298791 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298795 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298798 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298800 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298803 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298806 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298809 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:04.303315 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298811 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298814 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298817 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298820 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298823 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298826 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298831 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298834 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298837 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298841 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298844 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298847 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298850 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298853 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298856 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298859 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298862 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298864 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298869 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:04.303824 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298872 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298875 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298878 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298881 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298883 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298886 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298889 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298892 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298895 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298898 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298901 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298904 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298909 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298912 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298914 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298917 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298920 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298922 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298927 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:04.304286 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298931 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298934 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298937 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298941 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298944 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298949 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298951 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298954 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298957 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298960 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298963 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298966 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298968 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298971 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298974 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298976 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298979 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298982 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298987 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298990 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:04.304778 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298993 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:04.305274 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.298997 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:04.305274 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.299001 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:04.305274 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.299005 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:04.305274 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.299934 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:04.306446 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.306427 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:04.306479 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.306446 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:04.306506 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306495 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:04.306506 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306500 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:04.306506 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306504 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306508 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306512 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306516 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306519 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306522 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306525 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306528 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306530 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306533 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306536 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306539 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306541 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306544 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306547 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306550 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306553 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306555 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306558 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:04.306620 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306561 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306563 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306566 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306569 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306571 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306573 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306577 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306580 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306582 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306585 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306588 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306590 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306593 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306595 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306598 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306601 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306604 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306606 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306609 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306611 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:04.307121 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306614 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306616 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306619 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306621 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306624 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306627 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306629 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306633 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306635 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306638 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306641 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306643 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306646 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306648 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306651 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306654 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306656 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306661 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306682 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306686 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:04.307612 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306689 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306692 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306695 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306698 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306701 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306704 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306707 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306710 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306713 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306716 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306719 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306721 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306724 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306726 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306729 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306732 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306734 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306737 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306740 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306742 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:04.308130 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306745 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306748 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306751 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306754 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306756 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.306761 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306857 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306863 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306866 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306869 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306872 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306876 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306879 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306882 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306885 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306887 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:04.308610 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306890 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306893 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306895 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306898 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306901 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306904 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306906 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306909 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306912 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306915 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306917 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306920 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306922 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306925 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306928 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306930 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306933 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306935 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306938 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306940 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:04.309027 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306943 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306945 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306948 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306951 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306953 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306956 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306958 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306961 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306964 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306966 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306969 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306971 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306974 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306977 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306980 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306985 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306987 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306990 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306992 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306995 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:04.309520 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.306998 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307001 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307003 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307006 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307008 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307011 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307014 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307016 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307020 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307024 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307027 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307029 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307032 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307035 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307037 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307040 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307044 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307046 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307049 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:04.310032 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307052 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307054 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307057 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307060 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307062 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307065 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307067 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307070 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307072 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307075 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307078 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307080 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307083 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307085 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307088 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307090 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:04.310497 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:04.307093 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:04.310910 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.307098 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:04.310910 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.307899 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:04.310910 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.310734 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:04.311781 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.311768 2573 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:04.311882 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.311865 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:04.311920 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.311911 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:04.339176 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.339154 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:04.343972 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.343953 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:04.362722 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.362697 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:04.368244 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.368102 2573 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:04.370798 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.370783 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:04.371261 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.371245 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:04.374886 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.374859 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a7338b73-011e-49e6-8c83-1eff28311ac3:/dev/nvme0n1p3 ba9afab2-e60c-4657-a433-e87b00fc3fab:/dev/nvme0n1p4]
Apr 16 13:59:04.374976 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.374885 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:04.381989 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.381872 2573 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:04.380218135 +0000 UTC m=+0.424251312 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098947 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23b596c1fdd6505d8b3fbb3dc70242 SystemUUID:ec23b596-c1fd-d650-5d8b-3fbb3dc70242 BootID:6470cc8d-0971-4039-a4fd-b83ad346449a Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d4:41:c8:af:ed Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d4:41:c8:af:ed Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:65:ee:61:90:02 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:04.381989 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.381977 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:04.382122 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.382062 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:04.383775 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.383751 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:04.383925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.383777 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-195.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:04.383971 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.383932 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:04.383971 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.383941 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:04.383971 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.383954
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:04.385716 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.385705 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:04.386555 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.386545 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:04.386658 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.386650 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:04.389695 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.389685 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:04.389735 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.389700 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:59:04.389735 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.389721 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:04.389735 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.389730 2573 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:04.389818 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.389739 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:04.391321 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.391013 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:04.391321 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.391033 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:04.394329 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.394307 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:04.396176 ip-10-0-130-195 
kubenswrapper[2573]: I0416 13:59:04.396159 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:04.397367 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397354 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397374 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397383 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397392 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397402 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397411 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397421 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397431 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:04.397445 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397441 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:04.397735 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397451 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:04.397735 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397471 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
13:59:04.397735 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.397487 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:04.398416 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.398404 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:04.398468 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.398419 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:04.401396 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.401368 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:04.401492 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.401373 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-195.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:04.401492 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.401462 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-195.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:04.402147 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.402135 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:04.402189 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.402170 2573 server.go:1295] "Started kubelet" Apr 16 13:59:04.402298 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.402242 2573 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 16 13:59:04.402342 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.402288 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:04.402375 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.402362 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:04.403034 ip-10-0-130-195 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:59:04.404011 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.403819 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:04.404075 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.404064 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:04.409064 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.409048 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:04.409766 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.409741 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:04.410518 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.410497 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:04.410616 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.410496 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:04.410727 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.410708 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:04.410817 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.410792 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:04.410817 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.410805 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:04.411008 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.410991 2573 factory.go:153] Registering CRI-O factory Apr 16 
13:59:04.411075 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411066 2573 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:04.411143 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411128 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:04.411211 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411143 2573 factory.go:55] Registering systemd factory Apr 16 13:59:04.411211 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411151 2573 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:04.411211 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411171 2573 factory.go:103] Registering Raw factory Apr 16 13:59:04.411211 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411183 2573 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:04.411574 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.411551 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found" Apr 16 13:59:04.411653 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.411589 2573 manager.go:319] Starting recovery of all containers Apr 16 13:59:04.412488 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.412445 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:59:04.414936 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.414908 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-24b4q" Apr 16 13:59:04.417824 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.417793 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-195.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 13:59:04.418440 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.418411 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 13:59:04.419217 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.417952 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-195.ec2.internal.18a6db0b3f95d818 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-195.ec2.internal,UID:ip-10-0-130-195.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-195.ec2.internal,},FirstTimestamp:2026-04-16 13:59:04.402147352 +0000 UTC m=+0.446180528,LastTimestamp:2026-04-16 13:59:04.402147352 +0000 UTC 
m=+0.446180528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-195.ec2.internal,}" Apr 16 13:59:04.419907 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.419883 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-24b4q" Apr 16 13:59:04.423862 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.423843 2573 manager.go:324] Recovery completed Apr 16 13:59:04.427952 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.427939 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:04.431626 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.431608 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:04.431725 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.431643 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:04.431725 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.431654 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:04.432154 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.432137 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:04.432154 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.432150 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:04.432240 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.432165 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:04.433690 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.433609 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-195.ec2.internal.18a6db0b4157ab60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-195.ec2.internal,UID:ip-10-0-130-195.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-195.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-195.ec2.internal,},FirstTimestamp:2026-04-16 13:59:04.431627104 +0000 UTC m=+0.475660282,LastTimestamp:2026-04-16 13:59:04.431627104 +0000 UTC m=+0.475660282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-195.ec2.internal,}" Apr 16 13:59:04.435020 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.435002 2573 policy_none.go:49] "None policy: Start" Apr 16 13:59:04.435093 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.435024 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:04.435093 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.435038 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:04.471259 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471244 2573 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.471276 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471286 2573 server.go:85] "Starting device plugin registration server" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471517 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471526 
2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471610 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471711 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.471718 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.472212 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:59:04.484543 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.472242 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-195.ec2.internal\" not found" Apr 16 13:59:04.510633 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.510606 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:59:04.511731 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.511716 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:59:04.511795 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.511743 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:04.511795 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.511770 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 13:59:04.511795 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.511777 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:04.511950 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.511812 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:04.515687 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.515651 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:04.572562 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.572479 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:04.573598 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.573579 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:04.573693 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.573613 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:04.573693 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.573630 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:04.573693 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.573657 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.582995 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.582976 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.583036 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.583000 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-195.ec2.internal\": node \"ip-10-0-130-195.ec2.internal\" not found" Apr 16 
13:59:04.612816 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.612779 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"] Apr 16 13:59:04.612923 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.612850 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:04.613696 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.613680 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:04.613751 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.613712 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:04.613751 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.613722 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:04.614970 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.614958 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:04.615100 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615085 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.615133 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615115 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:04.615653 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615636 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:04.615779 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615681 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:04.615779 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615683 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:04.615779 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615696 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:04.615779 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615708 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:04.615779 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.615719 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:04.617519 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.617504 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.617614 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.617531 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:04.618136 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.618121 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:04.618136 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.618149 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:04.618253 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.618162 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:04.637277 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.637257 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found" Apr 16 13:59:04.648279 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.648262 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-195.ec2.internal\" not found" node="ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.652849 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.652834 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-195.ec2.internal\" not found" node="ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.711913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.711888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b944d19d12689cfa5af24924745cf2e9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal\" (UID: \"b944d19d12689cfa5af24924745cf2e9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.712055 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.711921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6a28db42b1f5e7c92afb50f5a79e5c07-config\") pod \"kube-apiserver-proxy-ip-10-0-130-195.ec2.internal\" (UID: \"6a28db42b1f5e7c92afb50f5a79e5c07\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.712055 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.711947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b944d19d12689cfa5af24924745cf2e9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal\" (UID: \"b944d19d12689cfa5af24924745cf2e9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.737658 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.737630 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found" Apr 16 13:59:04.813042 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.813010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6a28db42b1f5e7c92afb50f5a79e5c07-config\") pod \"kube-apiserver-proxy-ip-10-0-130-195.ec2.internal\" (UID: \"6a28db42b1f5e7c92afb50f5a79e5c07\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal" Apr 16 13:59:04.813194 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.813048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b944d19d12689cfa5af24924745cf2e9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal\" (UID: \"b944d19d12689cfa5af24924745cf2e9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:04.813194 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.813098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b944d19d12689cfa5af24924745cf2e9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal\" (UID: \"b944d19d12689cfa5af24924745cf2e9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:04.813194 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.813103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6a28db42b1f5e7c92afb50f5a79e5c07-config\") pod \"kube-apiserver-proxy-ip-10-0-130-195.ec2.internal\" (UID: \"6a28db42b1f5e7c92afb50f5a79e5c07\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:04.813194 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.813160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b944d19d12689cfa5af24924745cf2e9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal\" (UID: \"b944d19d12689cfa5af24924745cf2e9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:04.813334 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.813206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b944d19d12689cfa5af24924745cf2e9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal\" (UID: \"b944d19d12689cfa5af24924745cf2e9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:04.838181 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.838111 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:04.938952 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:04.938916 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:04.952119 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.952095 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:04.955793 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:04.955774 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:05.039044 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.039009 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.139526 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.139459 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.239936 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.239913 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.294283 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.294253 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:05.312073 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.312052 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:05.312222 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.312204 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:05.312274 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.312218 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:05.340433 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.340411 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.409230 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.409138 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:05.418389 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.418369 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:05.422185 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.422156 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:04 +0000 UTC" deadline="2028-02-02 00:16:30.71246963 +0000 UTC"
Apr 16 13:59:05.422185 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.422182 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15754h17m25.290290894s"
Apr 16 13:59:05.422817 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:05.422792 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb944d19d12689cfa5af24924745cf2e9.slice/crio-b66b1f8e11a67d73623c9ba5c6626f27bacc1ac001cc2b3b44b33f4907528053 WatchSource:0}: Error finding container b66b1f8e11a67d73623c9ba5c6626f27bacc1ac001cc2b3b44b33f4907528053: Status 404 returned error can't find the container with id b66b1f8e11a67d73623c9ba5c6626f27bacc1ac001cc2b3b44b33f4907528053
Apr 16 13:59:05.423177 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:05.423161 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a28db42b1f5e7c92afb50f5a79e5c07.slice/crio-44c3730c43471a5d9db702ca0762fa65372ac2202fb31e7b8de04e8e79d8e805 WatchSource:0}: Error finding container 44c3730c43471a5d9db702ca0762fa65372ac2202fb31e7b8de04e8e79d8e805: Status 404 returned error can't find the container with id 44c3730c43471a5d9db702ca0762fa65372ac2202fb31e7b8de04e8e79d8e805
Apr 16 13:59:05.428717 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.428521 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:05.440547 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.440522 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.440616 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.440545 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-88rkz"
Apr 16 13:59:05.451563 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.451545 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-88rkz"
Apr 16 13:59:05.515030 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.514954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" event={"ID":"b944d19d12689cfa5af24924745cf2e9","Type":"ContainerStarted","Data":"b66b1f8e11a67d73623c9ba5c6626f27bacc1ac001cc2b3b44b33f4907528053"}
Apr 16 13:59:05.515779 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.515760 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal" event={"ID":"6a28db42b1f5e7c92afb50f5a79e5c07","Type":"ContainerStarted","Data":"44c3730c43471a5d9db702ca0762fa65372ac2202fb31e7b8de04e8e79d8e805"}
Apr 16 13:59:05.540932 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.540911 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.641399 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:05.641367 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-195.ec2.internal\" not found"
Apr 16 13:59:05.687584 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.687528 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:05.710787 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.710755 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:05.721624 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.721605 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:05.723541 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.723530 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal"
Apr 16 13:59:05.734224 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.734210 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:05.913241 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:05.913210 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:06.391183 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.391152 2573 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:06.398209 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.398182 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:06.399397 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.399378 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:06.401700 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.401660 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2","openshift-cluster-node-tuning-operator/tuned-r275w","openshift-image-registry/node-ca-kh2z9","openshift-multus/multus-xt4tf","openshift-network-diagnostics/network-check-target-cb8c5","openshift-ovn-kubernetes/ovnkube-node-kftpl","kube-system/konnectivity-agent-j8v85","openshift-dns/node-resolver-rh5gj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal","openshift-multus/multus-additional-cni-plugins-twv7g","openshift-multus/network-metrics-daemon-59j5c","openshift-network-operator/iptables-alerter-zbxjq"]
Apr 16 13:59:06.403702 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.403682 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.405319 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.405292 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2"
Apr 16 13:59:06.406453 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406434 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406640 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2ffqx\""
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406657 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406786 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406796 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406801 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406805 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:59:06.406925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.406846 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:59:06.408611 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.407842 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kh2z9"
Apr 16 13:59:06.408611 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.407947 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:59:06.408611 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.408257 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p46mk\""
Apr 16 13:59:06.408831 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.408781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:59:06.409658 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.409635 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:59:06.409761 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.409718 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7cvl4\""
Apr 16 13:59:06.409761 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.409726 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:06.410993 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.410053 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:06.410993 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.410071 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 13:59:06.410993 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.410213 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 13:59:06.410993 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.410536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 13:59:06.410993 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.410693 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8vhv7\""
Apr 16 13:59:06.411432 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.411412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.412020 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.411581 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 13:59:06.412020 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.411687 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb"
Apr 16 13:59:06.414250 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.413809 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nnffk\""
Apr 16 13:59:06.414250 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.414001 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:59:06.414250 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.414034 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:59:06.414250 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.414140 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:59:06.415090 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.415061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j8v85"
Apr 16 13:59:06.415581 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.415566 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:59:06.416482 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.416455 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rh5gj"
Apr 16 13:59:06.416586 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.416571 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-twv7g"
Apr 16 13:59:06.417350 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.417321 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-67l4s\""
Apr 16 13:59:06.417527 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.417506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:59:06.417623 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.417538 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:59:06.417854 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.417829 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c"
Apr 16 13:59:06.417939 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.417898 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a"
Apr 16 13:59:06.418592 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.418576 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-h5c4g\""
Apr 16 13:59:06.418877 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.418859 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:59:06.418963 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.418879 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:59:06.418963 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.418896 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 13:59:06.419166 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419150 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vsbf4\""
Apr 16 13:59:06.419238 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419184 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:59:06.419483 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419468 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zbxjq"
Apr 16 13:59:06.419943 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca354899-9eed-44c4-8f19-33e699024e89-cni-binary-copy\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420031 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420031 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420031 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-systemd\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420031 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.419999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-ovn\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420031 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c7126c-784b-47c6-88ca-c5375d70b493-ovn-node-metrics-cert\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-sys-fs\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-cni-multus\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-kubelet\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-conf-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420108 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-netns\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-systemd-units\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-ovnkube-script-lib\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysctl-d\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-ovnkube-config\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-log-socket\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snqx4\" (UniqueName: \"kubernetes.io/projected/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-kube-api-access-snqx4\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-var-lib-kubelet\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420379 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-run-netns\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-device-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-kubernetes\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-run\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npm8g\" (UniqueName: \"kubernetes.io/projected/a3c7126c-784b-47c6-88ca-c5375d70b493-kube-api-access-npm8g\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-sys\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-os-release\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-k8s-cni-cncf-io\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-cni-bin\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-kubelet\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-lib-modules\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-host\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420734 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwlg9\" (UniqueName: \"kubernetes.io/projected/f4089fed-dbf0-4e49-a815-79b762aba862-kube-api-access-jwlg9\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9"
Apr 16 13:59:06.420835 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-system-cni-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420812 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-modprobe-d\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420836 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-systemd\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-run-ovn-kubernetes\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420901 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-cni-bin\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-tuned\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-multus-certs\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.420996 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-cni-netd\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-registration-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4089fed-dbf0-4e49-a815-79b762aba862-host\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f4089fed-dbf0-4e49-a815-79b762aba862-serviceca\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-socket-dir-parent\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-etc-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421225 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysctl-conf\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") "
pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-etc-kubernetes\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.421682 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjnbq\" (UniqueName: \"kubernetes.io/projected/ca354899-9eed-44c4-8f19-33e699024e89-kube-api-access-vjnbq\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-slash\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-env-overrides\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-etc-selinux\") 
pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-var-lib-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v98x\" (UniqueName: \"kubernetes.io/projected/47298b75-037b-4a3e-87aa-d3244b5c60f9-kube-api-access-4v98x\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-cni-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca354899-9eed-44c4-8f19-33e699024e89-multus-daemon-config\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421495 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-socket-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47298b75-037b-4a3e-87aa-d3244b5c60f9-tmp\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-hostroot\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-node-log\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysconfig\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.422347 ip-10-0-130-195 
kubenswrapper[2573]: I0416 13:59:06.421614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-cnibin\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421731 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nnzhc\"" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421735 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.421931 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:06.422347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.422075 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:06.452553 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.452521 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:05 +0000 UTC" deadline="2027-11-10 17:29:02.964828246 +0000 UTC" Apr 16 13:59:06.452553 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.452550 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13755h29m56.51228244s" Apr 16 13:59:06.511997 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.511971 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:06.521955 ip-10-0-130-195 kubenswrapper[2573]: 
I0416 13:59:06.521932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-socket-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.521962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47298b75-037b-4a3e-87aa-d3244b5c60f9-tmp\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.521980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-hostroot\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9473d2b0-bd27-437c-8163-34180f007c16-iptables-alerter-script\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-node-log\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522110 ip-10-0-130-195 
kubenswrapper[2573]: I0416 13:59:06.522038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysconfig\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-cnibin\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522081 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca354899-9eed-44c4-8f19-33e699024e89-cni-binary-copy\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.522110 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-node-log\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysconfig\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522106 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-cni-binary-copy\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-hostroot\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522137 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-cnibin\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-socket-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 
13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/19493ff7-1060-4a3e-af97-e32119792569-konnectivity-ca\") pod \"konnectivity-agent-j8v85\" (UID: \"19493ff7-1060-4a3e-af97-e32119792569\") " pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-openvswitch\") pod 
\"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7lcp\" (UniqueName: \"kubernetes.io/projected/9473d2b0-bd27-437c-8163-34180f007c16-kube-api-access-r7lcp\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522333 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522353 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-tmp-dir\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-systemd\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-ovn\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.522411 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-systemd\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-run-ovn\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c7126c-784b-47c6-88ca-c5375d70b493-ovn-node-metrics-cert\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-sys-fs\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-cni-multus\") pod \"multus-xt4tf\" (UID: 
\"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-kubelet\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-cni-multus\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-conf-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-kubelet\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca354899-9eed-44c4-8f19-33e699024e89-cni-binary-copy\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " 
pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522605 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dskbw\" (UniqueName: \"kubernetes.io/projected/35776b6a-fb9e-462b-9392-ed0451ab2515-kube-api-access-dskbw\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-conf-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-sys-fs\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-netns\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-systemd-units\") pod \"ovnkube-node-kftpl\" (UID: 
\"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-ovnkube-script-lib\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-systemd-units\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522778 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-netns\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.523161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysctl-d\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522878 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-ovnkube-config\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysctl-d\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.522916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-log-socket\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-log-socket\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snqx4\" (UniqueName: \"kubernetes.io/projected/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-kube-api-access-snqx4\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-var-lib-kubelet\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-var-lib-kubelet\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-run-netns\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523216 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-device-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-kubernetes\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-run\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-run-netns\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npm8g\" (UniqueName: \"kubernetes.io/projected/a3c7126c-784b-47c6-88ca-c5375d70b493-kube-api-access-npm8g\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523286 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-ovnkube-script-lib\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.523960 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-device-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-ovnkube-config\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-kubernetes\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-sys\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523360 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-run\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-os-release\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-sys\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-k8s-cni-cncf-io\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-cni-bin\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-os-release\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-kubelet\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-k8s-cni-cncf-io\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-var-lib-cni-bin\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-lib-modules\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-host\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-kubelet\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.524849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwlg9\" (UniqueName: \"kubernetes.io/projected/f4089fed-dbf0-4e49-a815-79b762aba862-kube-api-access-jwlg9\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-lib-modules\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-host\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523568 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-system-cni-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-modprobe-d\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-system-cni-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-systemd\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-cnibin\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-systemd\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9473d2b0-bd27-437c-8163-34180f007c16-host-slash\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-hosts-file\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-modprobe-d\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-run-ovn-kubernetes\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-run-ovn-kubernetes\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523874 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-cni-bin\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523904 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-tuned\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.525684 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-multus-certs\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-cni-bin\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7rn\" (UniqueName: \"kubernetes.io/projected/45ca9ed1-9528-4529-8ffc-64027bd9e40a-kube-api-access-7z7rn\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.523994 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-host-run-multus-certs\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524049 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-cni-netd\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-registration-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4089fed-dbf0-4e49-a815-79b762aba862-host\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f4089fed-dbf0-4e49-a815-79b762aba862-serviceca\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-socket-dir-parent\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524147 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-cni-netd\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-registration-dir\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524174 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-os-release\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-socket-dir-parent\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4089fed-dbf0-4e49-a815-79b762aba862-host\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 
13:59:06.524270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-etc-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysctl-conf\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-etc-kubernetes\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjnbq\" (UniqueName: \"kubernetes.io/projected/ca354899-9eed-44c4-8f19-33e699024e89-kube-api-access-vjnbq\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-system-cni-dir\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.526785 
ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/19493ff7-1060-4a3e-af97-e32119792569-agent-certs\") pod \"konnectivity-agent-j8v85\" (UID: \"19493ff7-1060-4a3e-af97-e32119792569\") " pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524440 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7fg4\" (UniqueName: \"kubernetes.io/projected/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-kube-api-access-f7fg4\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524448 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-sysctl-conf\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-slash\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-env-overrides\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-etc-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524516 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-etc-selinux\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-etc-kubernetes\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f4089fed-dbf0-4e49-a815-79b762aba862-serviceca\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524572 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " 
pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-var-lib-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-var-lib-openvswitch\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-etc-selinux\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v98x\" (UniqueName: \"kubernetes.io/projected/47298b75-037b-4a3e-87aa-d3244b5c60f9-kube-api-access-4v98x\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-cni-dir\") pod \"multus-xt4tf\" 
(UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.526785 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca354899-9eed-44c4-8f19-33e699024e89-multus-daemon-config\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca354899-9eed-44c4-8f19-33e699024e89-multus-cni-dir\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c7126c-784b-47c6-88ca-c5375d70b493-env-overrides\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47298b75-037b-4a3e-87aa-d3244b5c60f9-tmp\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.524735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c7126c-784b-47c6-88ca-c5375d70b493-host-slash\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c7126c-784b-47c6-88ca-c5375d70b493-ovn-node-metrics-cert\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.525924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca354899-9eed-44c4-8f19-33e699024e89-multus-daemon-config\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.527216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.526101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/47298b75-037b-4a3e-87aa-d3244b5c60f9-etc-tuned\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.532135 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.532114 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:06.532443 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.532139 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:06.532443 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.532152 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ppfhm for pod openshift-network-diagnostics/network-check-target-cb8c5: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:06.532443 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.532225 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm podName:c63a949a-1dfc-4b9a-a5f7-ab03e43113fb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:07.032205313 +0000 UTC m=+3.076238499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ppfhm" (UniqueName: "kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm") pod "network-check-target-cb8c5" (UID: "c63a949a-1dfc-4b9a-a5f7-ab03e43113fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:06.535203 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.535163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v98x\" (UniqueName: \"kubernetes.io/projected/47298b75-037b-4a3e-87aa-d3244b5c60f9-kube-api-access-4v98x\") pod \"tuned-r275w\" (UID: \"47298b75-037b-4a3e-87aa-d3244b5c60f9\") " pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.535536 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.535512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snqx4\" (UniqueName: \"kubernetes.io/projected/61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf-kube-api-access-snqx4\") pod \"aws-ebs-csi-driver-node-qvzv2\" (UID: \"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.535640 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.535535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npm8g\" (UniqueName: 
\"kubernetes.io/projected/a3c7126c-784b-47c6-88ca-c5375d70b493-kube-api-access-npm8g\") pod \"ovnkube-node-kftpl\" (UID: \"a3c7126c-784b-47c6-88ca-c5375d70b493\") " pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.535640 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.535595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwlg9\" (UniqueName: \"kubernetes.io/projected/f4089fed-dbf0-4e49-a815-79b762aba862-kube-api-access-jwlg9\") pod \"node-ca-kh2z9\" (UID: \"f4089fed-dbf0-4e49-a815-79b762aba862\") " pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.535927 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.535904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjnbq\" (UniqueName: \"kubernetes.io/projected/ca354899-9eed-44c4-8f19-33e699024e89-kube-api-access-vjnbq\") pod \"multus-xt4tf\" (UID: \"ca354899-9eed-44c4-8f19-33e699024e89\") " pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.626284 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-cnibin\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.626284 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9473d2b0-bd27-437c-8163-34180f007c16-host-slash\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-hosts-file\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7rn\" (UniqueName: \"kubernetes.io/projected/45ca9ed1-9528-4529-8ffc-64027bd9e40a-kube-api-access-7z7rn\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-cnibin\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626372 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9473d2b0-bd27-437c-8163-34180f007c16-host-slash\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-hosts-file\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-os-release\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-system-cni-dir\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/19493ff7-1060-4a3e-af97-e32119792569-agent-certs\") pod \"konnectivity-agent-j8v85\" (UID: \"19493ff7-1060-4a3e-af97-e32119792569\") " pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.626526 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7fg4\" (UniqueName: \"kubernetes.io/projected/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-kube-api-access-f7fg4\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.626891 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-system-cni-dir\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.626891 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626551 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:06.626891 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-os-release\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.626891 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.626637 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:06.626891 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.626694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9473d2b0-bd27-437c-8163-34180f007c16-iptables-alerter-script\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.627062 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:06.627045 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:07.127020113 +0000 UTC m=+3.171053300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:06.627139 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-cni-binary-copy\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.627139 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.627223 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/19493ff7-1060-4a3e-af97-e32119792569-konnectivity-ca\") pod \"konnectivity-agent-j8v85\" (UID: \"19493ff7-1060-4a3e-af97-e32119792569\") " pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.627223 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7lcp\" (UniqueName: \"kubernetes.io/projected/9473d2b0-bd27-437c-8163-34180f007c16-kube-api-access-r7lcp\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " 
pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.627223 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-tmp-dir\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.627355 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dskbw\" (UniqueName: \"kubernetes.io/projected/35776b6a-fb9e-462b-9392-ed0451ab2515-kube-api-access-dskbw\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.627355 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627291 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-tuning-conf-dir\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.628058 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.627805 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.628180 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.628104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/19493ff7-1060-4a3e-af97-e32119792569-konnectivity-ca\") pod \"konnectivity-agent-j8v85\" (UID: \"19493ff7-1060-4a3e-af97-e32119792569\") " pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.628258 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.628185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-tmp-dir\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.628258 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.628222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9473d2b0-bd27-437c-8163-34180f007c16-iptables-alerter-script\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.630633 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.628710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35776b6a-fb9e-462b-9392-ed0451ab2515-tuning-conf-dir\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.631206 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.631084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-cni-binary-copy\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.631206 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.631108 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.631206 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.631126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/35776b6a-fb9e-462b-9392-ed0451ab2515-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.633093 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.633064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/19493ff7-1060-4a3e-af97-e32119792569-agent-certs\") pod \"konnectivity-agent-j8v85\" (UID: \"19493ff7-1060-4a3e-af97-e32119792569\") " pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.635052 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.635029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7rn\" (UniqueName: \"kubernetes.io/projected/45ca9ed1-9528-4529-8ffc-64027bd9e40a-kube-api-access-7z7rn\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:06.635185 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.635166 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7fg4\" (UniqueName: \"kubernetes.io/projected/ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d-kube-api-access-f7fg4\") pod \"node-resolver-rh5gj\" (UID: \"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d\") " pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.635441 ip-10-0-130-195 
kubenswrapper[2573]: I0416 13:59:06.635419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7lcp\" (UniqueName: \"kubernetes.io/projected/9473d2b0-bd27-437c-8163-34180f007c16-kube-api-access-r7lcp\") pod \"iptables-alerter-zbxjq\" (UID: \"9473d2b0-bd27-437c-8163-34180f007c16\") " pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:06.635739 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.635722 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dskbw\" (UniqueName: \"kubernetes.io/projected/35776b6a-fb9e-462b-9392-ed0451ab2515-kube-api-access-dskbw\") pod \"multus-additional-cni-plugins-twv7g\" (UID: \"35776b6a-fb9e-462b-9392-ed0451ab2515\") " pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.717905 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.717823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:06.724591 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.724564 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" Apr 16 13:59:06.732347 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.732318 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r275w" Apr 16 13:59:06.736925 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.736900 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kh2z9" Apr 16 13:59:06.743486 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.743464 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xt4tf" Apr 16 13:59:06.750053 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.750032 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:06.757584 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.757563 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rh5gj" Apr 16 13:59:06.763498 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.763476 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-twv7g" Apr 16 13:59:06.772019 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:06.771997 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zbxjq" Apr 16 13:59:07.042839 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.042808 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19493ff7_1060_4a3e_af97_e32119792569.slice/crio-86bdcd1f60c1e8afb00cecdbaf9072af5c0ea6624ec65f4f655620f3f9a8bc52 WatchSource:0}: Error finding container 86bdcd1f60c1e8afb00cecdbaf9072af5c0ea6624ec65f4f655620f3f9a8bc52: Status 404 returned error can't find the container with id 86bdcd1f60c1e8afb00cecdbaf9072af5c0ea6624ec65f4f655620f3f9a8bc52 Apr 16 13:59:07.043786 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.043761 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61299c1a_c4f0_4b6c_9f79_4cddd37b5bcf.slice/crio-49cc3c0453bfabd149b272c1bbb655a3f8d81d910f4d2634c4c798bade066b6c WatchSource:0}: Error finding container 49cc3c0453bfabd149b272c1bbb655a3f8d81d910f4d2634c4c798bade066b6c: Status 404 returned error can't find the container with id 49cc3c0453bfabd149b272c1bbb655a3f8d81d910f4d2634c4c798bade066b6c Apr 16 13:59:07.045149 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.045036 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4089fed_dbf0_4e49_a815_79b762aba862.slice/crio-d0943bc720f89ca414a386ddc9c4180b1b17b441fd34d3b1ca0c0546b25a2b72 WatchSource:0}: Error finding container d0943bc720f89ca414a386ddc9c4180b1b17b441fd34d3b1ca0c0546b25a2b72: Status 404 returned error can't find the container with id d0943bc720f89ca414a386ddc9c4180b1b17b441fd34d3b1ca0c0546b25a2b72 Apr 16 13:59:07.047523 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.047498 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47298b75_037b_4a3e_87aa_d3244b5c60f9.slice/crio-5e1d2fce65f0eb41138218f3e038fa18dec8b6da8ce66632d28d1474819237e0 WatchSource:0}: Error finding container 5e1d2fce65f0eb41138218f3e038fa18dec8b6da8ce66632d28d1474819237e0: Status 404 returned error can't find the container with id 5e1d2fce65f0eb41138218f3e038fa18dec8b6da8ce66632d28d1474819237e0 Apr 16 13:59:07.048181 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.048161 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee74f40d_8d8e_43f7_925f_88a0b6fe4a5d.slice/crio-88540152df8a1acc5012a898c750cc6a49cdd99eb1c0260978b560e0c227a0ed WatchSource:0}: Error finding container 88540152df8a1acc5012a898c750cc6a49cdd99eb1c0260978b560e0c227a0ed: Status 404 returned error can't find the container with id 88540152df8a1acc5012a898c750cc6a49cdd99eb1c0260978b560e0c227a0ed Apr 16 13:59:07.048950 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.048929 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9473d2b0_bd27_437c_8163_34180f007c16.slice/crio-b4b712e2c506a28a0cc8dba26a176307e1e2ff399b47334916b7fb2b29b69c06 WatchSource:0}: Error finding container b4b712e2c506a28a0cc8dba26a176307e1e2ff399b47334916b7fb2b29b69c06: Status 404 returned error can't find 
the container with id b4b712e2c506a28a0cc8dba26a176307e1e2ff399b47334916b7fb2b29b69c06 Apr 16 13:59:07.049898 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.049880 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca354899_9eed_44c4_8f19_33e699024e89.slice/crio-01e362b1ec429595a6d2ea9b09518ee7953af202def68c61611d4dc66da57c32 WatchSource:0}: Error finding container 01e362b1ec429595a6d2ea9b09518ee7953af202def68c61611d4dc66da57c32: Status 404 returned error can't find the container with id 01e362b1ec429595a6d2ea9b09518ee7953af202def68c61611d4dc66da57c32 Apr 16 13:59:07.052390 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:07.052367 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c7126c_784b_47c6_88ca_c5375d70b493.slice/crio-e9222b1b52708d3955abe1a7754af122b57c14caf76173672ebf563e89bcede1 WatchSource:0}: Error finding container e9222b1b52708d3955abe1a7754af122b57c14caf76173672ebf563e89bcede1: Status 404 returned error can't find the container with id e9222b1b52708d3955abe1a7754af122b57c14caf76173672ebf563e89bcede1 Apr 16 13:59:07.130273 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.130248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:07.130350 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.130288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " 
pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:07.130384 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.130373 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:07.130426 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.130388 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:07.130426 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.130410 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:07.130426 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.130415 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:08.13040265 +0000 UTC m=+4.174435814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:07.130426 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.130422 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ppfhm for pod openshift-network-diagnostics/network-check-target-cb8c5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:07.130556 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.130473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm podName:c63a949a-1dfc-4b9a-a5f7-ab03e43113fb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:08.130454676 +0000 UTC m=+4.174487855 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ppfhm" (UniqueName: "kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm") pod "network-check-target-cb8c5" (UID: "c63a949a-1dfc-4b9a-a5f7-ab03e43113fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:07.453531 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.453428 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:05 +0000 UTC" deadline="2027-09-15 11:44:38.364468215 +0000 UTC" Apr 16 13:59:07.453531 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.453469 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12405h45m30.911005459s" Apr 16 13:59:07.512905 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.512871 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:07.513087 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:07.513012 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:07.519984 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.519955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kh2z9" event={"ID":"f4089fed-dbf0-4e49-a815-79b762aba862","Type":"ContainerStarted","Data":"d0943bc720f89ca414a386ddc9c4180b1b17b441fd34d3b1ca0c0546b25a2b72"} Apr 16 13:59:07.521181 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.521154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j8v85" event={"ID":"19493ff7-1060-4a3e-af97-e32119792569","Type":"ContainerStarted","Data":"86bdcd1f60c1e8afb00cecdbaf9072af5c0ea6624ec65f4f655620f3f9a8bc52"} Apr 16 13:59:07.523089 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.523048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal" event={"ID":"6a28db42b1f5e7c92afb50f5a79e5c07","Type":"ContainerStarted","Data":"6a8dd551879465565e7fc12be32ab0301161e33910931cadab3a2d65763af4d6"} Apr 16 13:59:07.524208 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.524180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" event={"ID":"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf","Type":"ContainerStarted","Data":"49cc3c0453bfabd149b272c1bbb655a3f8d81d910f4d2634c4c798bade066b6c"} Apr 16 13:59:07.525288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.525264 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rh5gj" event={"ID":"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d","Type":"ContainerStarted","Data":"88540152df8a1acc5012a898c750cc6a49cdd99eb1c0260978b560e0c227a0ed"} Apr 16 13:59:07.527046 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.527023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" 
event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"e9222b1b52708d3955abe1a7754af122b57c14caf76173672ebf563e89bcede1"} Apr 16 13:59:07.528348 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.528322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerStarted","Data":"6f8260cb4ab06cd92aaf20f9130a079d29873a4fe6299ba7a7735d9a198051b8"} Apr 16 13:59:07.529370 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.529317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zbxjq" event={"ID":"9473d2b0-bd27-437c-8163-34180f007c16","Type":"ContainerStarted","Data":"b4b712e2c506a28a0cc8dba26a176307e1e2ff399b47334916b7fb2b29b69c06"} Apr 16 13:59:07.530362 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.530344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xt4tf" event={"ID":"ca354899-9eed-44c4-8f19-33e699024e89","Type":"ContainerStarted","Data":"01e362b1ec429595a6d2ea9b09518ee7953af202def68c61611d4dc66da57c32"} Apr 16 13:59:07.531596 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.531566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r275w" event={"ID":"47298b75-037b-4a3e-87aa-d3244b5c60f9","Type":"ContainerStarted","Data":"5e1d2fce65f0eb41138218f3e038fa18dec8b6da8ce66632d28d1474819237e0"} Apr 16 13:59:07.538216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.538052 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-195.ec2.internal" podStartSLOduration=2.538038169 podStartE2EDuration="2.538038169s" podCreationTimestamp="2026-04-16 13:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
13:59:07.537641901 +0000 UTC m=+3.581675088" watchObservedRunningTime="2026-04-16 13:59:07.538038169 +0000 UTC m=+3.582071351" Apr 16 13:59:07.877449 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:07.877371 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:08.138992 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:08.138903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:08.138992 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:08.138973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:08.139206 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.139116 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:08.139206 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.139176 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:10.139157147 +0000 UTC m=+6.183190316 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:08.139602 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.139579 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:08.139731 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.139607 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:08.139731 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.139620 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ppfhm for pod openshift-network-diagnostics/network-check-target-cb8c5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:08.139731 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.139680 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm podName:c63a949a-1dfc-4b9a-a5f7-ab03e43113fb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:10.139647387 +0000 UTC m=+6.183680557 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ppfhm" (UniqueName: "kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm") pod "network-check-target-cb8c5" (UID: "c63a949a-1dfc-4b9a-a5f7-ab03e43113fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:08.514831 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:08.514722 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:08.515284 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:08.514853 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:08.539690 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:08.539328 2573 generic.go:358] "Generic (PLEG): container finished" podID="b944d19d12689cfa5af24924745cf2e9" containerID="2378f16a93862003a4eeb6303011b67fcfd39ae58f70b67e85892118e3321d6d" exitCode=0 Apr 16 13:59:08.539690 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:08.539465 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" event={"ID":"b944d19d12689cfa5af24924745cf2e9","Type":"ContainerDied","Data":"2378f16a93862003a4eeb6303011b67fcfd39ae58f70b67e85892118e3321d6d"} Apr 16 13:59:09.512799 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:09.512766 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:09.513068 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:09.512917 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:09.557685 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:09.556236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" event={"ID":"b944d19d12689cfa5af24924745cf2e9","Type":"ContainerStarted","Data":"6ebb63d041dea3b2ca2f85f2502cc30d66556315d3af7782235c485481724d6f"} Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:10.157264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:10.157327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.157455 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.157515 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:14.157497749 +0000 UTC m=+10.201530919 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.157948 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.157967 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.157980 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ppfhm for pod openshift-network-diagnostics/network-check-target-cb8c5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:10.158061 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.158024 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm podName:c63a949a-1dfc-4b9a-a5f7-ab03e43113fb nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:14.158008945 +0000 UTC m=+10.202042113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppfhm" (UniqueName: "kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm") pod "network-check-target-cb8c5" (UID: "c63a949a-1dfc-4b9a-a5f7-ab03e43113fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:10.514465 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:10.514195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:10.514465 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:10.514321 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:11.512883 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:11.512852 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:11.513339 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:11.512995 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:12.512617 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:12.512577 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:12.512812 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:12.512737 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:13.512302 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:13.512264 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:13.512755 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:13.512440 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:14.192558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:14.192639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.192817 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.192837 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.192850 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ppfhm for pod openshift-network-diagnostics/network-check-target-cb8c5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.192907 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm podName:c63a949a-1dfc-4b9a-a5f7-ab03e43113fb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.192888845 +0000 UTC m=+18.236922010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppfhm" (UniqueName: "kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm") pod "network-check-target-cb8c5" (UID: "c63a949a-1dfc-4b9a-a5f7-ab03e43113fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.192981 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:14.193091 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.193029 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.193015958 +0000 UTC m=+18.237049136 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:14.514774 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:14.513885 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:14.514774 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:14.513992 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:15.512042 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:15.512004 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:15.512268 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:15.512145 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:16.512974 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:16.512923 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:16.513425 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:16.513056 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:17.512200 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:17.512163 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:17.512385 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:17.512307 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:18.512451 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:18.512418 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:18.512872 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:18.512538 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:19.512058 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:19.512028 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:19.512330 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:19.512134 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:20.512177 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:20.512139 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:20.512685 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:20.512288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:21.512573 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.512540 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:21.513027 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:21.512702 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:21.553596 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.553544 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-195.ec2.internal" podStartSLOduration=16.553525555 podStartE2EDuration="16.553525555s" podCreationTimestamp="2026-04-16 13:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:09.575273118 +0000 UTC m=+5.619306306" watchObservedRunningTime="2026-04-16 13:59:21.553525555 +0000 UTC m=+17.597558743" Apr 16 13:59:21.554168 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.554141 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-l2l88"] Apr 16 13:59:21.650902 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.650872 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.651079 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:21.650975 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:21.751060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.751006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0c5b0215-6e48-4512-a0fa-d432021b128c-kubelet-config\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.751259 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.751068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.751259 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.751095 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0c5b0215-6e48-4512-a0fa-d432021b128c-dbus\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.852113 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.852072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0c5b0215-6e48-4512-a0fa-d432021b128c-kubelet-config\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.852280 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.852135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.852280 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.852162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0c5b0215-6e48-4512-a0fa-d432021b128c-dbus\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.852280 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.852202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0c5b0215-6e48-4512-a0fa-d432021b128c-kubelet-config\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:21.852454 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:21.852301 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:21.852454 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:21.852364 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret podName:0c5b0215-6e48-4512-a0fa-d432021b128c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.352345619 +0000 UTC m=+18.396378793 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret") pod "global-pull-secret-syncer-l2l88" (UID: "0c5b0215-6e48-4512-a0fa-d432021b128c") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:21.852454 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:21.852387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0c5b0215-6e48-4512-a0fa-d432021b128c-dbus\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:22.254896 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:22.254824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:22.254896 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:22.254872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:22.255054 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.254961 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:22.255054 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.254979 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:22.255054 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.254997 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:22.255054 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.255008 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ppfhm for pod openshift-network-diagnostics/network-check-target-cb8c5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:22.255054 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.255019 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.25500114 +0000 UTC m=+34.299034303 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:22.255054 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.255049 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm podName:c63a949a-1dfc-4b9a-a5f7-ab03e43113fb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.255036344 +0000 UTC m=+34.299069512 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ppfhm" (UniqueName: "kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm") pod "network-check-target-cb8c5" (UID: "c63a949a-1dfc-4b9a-a5f7-ab03e43113fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:22.355362 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:22.355316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:22.355507 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.355466 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:22.355598 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.355533 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret podName:0c5b0215-6e48-4512-a0fa-d432021b128c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:23.355514059 +0000 UTC m=+19.399547222 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret") pod "global-pull-secret-syncer-l2l88" (UID: "0c5b0215-6e48-4512-a0fa-d432021b128c") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:22.512234 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:22.512151 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:22.512379 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:22.512268 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:23.362115 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:23.362075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:23.362565 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:23.362234 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:23.362565 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:23.362308 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret podName:0c5b0215-6e48-4512-a0fa-d432021b128c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.362288337 +0000 UTC m=+21.406321501 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret") pod "global-pull-secret-syncer-l2l88" (UID: "0c5b0215-6e48-4512-a0fa-d432021b128c") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:23.513058 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:23.513021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:23.513242 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:23.513060 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:23.513242 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:23.513143 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:23.513360 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:23.513284 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:24.513460 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.513333 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:24.514257 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:24.513546 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:24.585538 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.585415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rh5gj" event={"ID":"ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d","Type":"ContainerStarted","Data":"776e501dd96cffdda783427e71338fe132c84ebb785d3b8979969f2e86bf2207"} Apr 16 13:59:24.587161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.587129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"e0608ce4a0b0066d704605045c93c9c8b707b9611e7aa43886e29f1700b10e3c"} Apr 16 13:59:24.587161 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.587158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"90c3a729e29d9ae439f912d2786571a0e99739ab8b0f4b06d0933d5c691ca1a7"} Apr 16 13:59:24.588489 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.588462 2573 generic.go:358] "Generic (PLEG): container finished" podID="35776b6a-fb9e-462b-9392-ed0451ab2515" containerID="0d05bb6e0ff7e886a15c1edf46b84725b178dd9855231bdc2fa73192b3571f38" exitCode=0 Apr 16 13:59:24.588579 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.588495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerDied","Data":"0d05bb6e0ff7e886a15c1edf46b84725b178dd9855231bdc2fa73192b3571f38"} Apr 16 13:59:24.590610 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.590376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xt4tf" event={"ID":"ca354899-9eed-44c4-8f19-33e699024e89","Type":"ContainerStarted","Data":"4bd45d0d84b68751137427caeb50bcce6bdc12272d4afb2c0a3ab86f3d62d65f"} Apr 16 13:59:24.593588 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.593565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r275w" event={"ID":"47298b75-037b-4a3e-87aa-d3244b5c60f9","Type":"ContainerStarted","Data":"ea194b55a332a41d72f1f77701bea38363406c6a952cdc50e6525956d5b4b88c"} Apr 16 13:59:24.595150 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.595125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kh2z9" event={"ID":"f4089fed-dbf0-4e49-a815-79b762aba862","Type":"ContainerStarted","Data":"905bc430293a8dad3102beec64cce05a2a27e5e1ce2ab201d7b61de1fc9afe35"} Apr 16 13:59:24.596420 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.596398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j8v85" event={"ID":"19493ff7-1060-4a3e-af97-e32119792569","Type":"ContainerStarted","Data":"f444c97fc265b4c1237afba28c83f00e8bc96ead9c3075433259a530ed4a012c"} Apr 16 13:59:24.597644 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.597623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" event={"ID":"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf","Type":"ContainerStarted","Data":"3a9e88f73b5d2594cdebbdeccc11f1a4334a37a31c5a0616f43a0a1b627cd691"} Apr 16 13:59:24.627885 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.627842 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rh5gj" podStartSLOduration=3.917530926 podStartE2EDuration="20.627828966s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.049893693 +0000 UTC m=+3.093926857" lastFinishedPulling="2026-04-16 13:59:23.760191718 +0000 UTC m=+19.804224897" observedRunningTime="2026-04-16 13:59:24.611286218 +0000 UTC m=+20.655319401" watchObservedRunningTime="2026-04-16 13:59:24.627828966 +0000 UTC m=+20.671862152" Apr 16 13:59:24.628217 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.628193 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kh2z9" podStartSLOduration=11.645453046 podStartE2EDuration="20.628188233s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.046832976 +0000 UTC m=+3.090866139" lastFinishedPulling="2026-04-16 13:59:16.029568163 +0000 UTC m=+12.073601326" observedRunningTime="2026-04-16 13:59:24.627585513 +0000 UTC m=+20.671618700" watchObservedRunningTime="2026-04-16 13:59:24.628188233 +0000 UTC m=+20.672221419" Apr 16 13:59:24.644793 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.644653 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xt4tf" podStartSLOduration=3.569089326 podStartE2EDuration="20.644638063s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.05490803 +0000 UTC m=+3.098941197" lastFinishedPulling="2026-04-16 13:59:24.130456761 +0000 UTC m=+20.174489934" observedRunningTime="2026-04-16 13:59:24.643936267 +0000 UTC m=+20.687969455" watchObservedRunningTime="2026-04-16 13:59:24.644638063 +0000 UTC m=+20.688671249" Apr 16 13:59:24.676935 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:24.676894 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-r275w" podStartSLOduration=3.601622145 podStartE2EDuration="20.67687814s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.04918334 +0000 UTC m=+3.093216514" lastFinishedPulling="2026-04-16 13:59:24.124439332 +0000 UTC m=+20.168472509" observedRunningTime="2026-04-16 13:59:24.673382749 +0000 UTC m=+20.717415935" watchObservedRunningTime="2026-04-16 13:59:24.67687814 +0000 UTC m=+20.720911328" Apr 16 13:59:25.378804 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.378771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:25.378984 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:25.378891 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:25.378984 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:25.378942 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret podName:0c5b0215-6e48-4512-a0fa-d432021b128c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.378929672 +0000 UTC m=+25.422962836 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret") pod "global-pull-secret-syncer-l2l88" (UID: "0c5b0215-6e48-4512-a0fa-d432021b128c") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:25.512796 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.512764 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:25.512966 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.512769 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:25.512966 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:25.512867 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:25.512966 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:25.512953 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:25.602713 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.602686 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 13:59:25.603154 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.603058 2573 generic.go:358] "Generic (PLEG): container finished" podID="a3c7126c-784b-47c6-88ca-c5375d70b493" containerID="e0608ce4a0b0066d704605045c93c9c8b707b9611e7aa43886e29f1700b10e3c" exitCode=1 Apr 16 13:59:25.603154 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.603095 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerDied","Data":"e0608ce4a0b0066d704605045c93c9c8b707b9611e7aa43886e29f1700b10e3c"} Apr 16 13:59:25.603154 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.603141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"bee8db2f732cea65f52611e7459a4cfa59491947b309690aa9c1ba6f5b34e71e"} Apr 16 13:59:25.603154 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.603157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"3f2149d77c9fc2ed69d19fe24f9419d56a140518a8aec92da43329988c3b43e6"} Apr 16 13:59:25.603341 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.603169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"36811cc1a6e61ad4d2ca7201a88d7245c7273630866493ad4a95f493ef5ebbb1"} Apr 16 13:59:25.603341 
ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:25.603182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"862786c49386e42ae63d06b4d8d5edc8c6db551837bc8df75a866ed5c03fc84f"} Apr 16 13:59:26.512904 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:26.512872 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:26.513123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:26.512997 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:26.606417 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:26.606378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zbxjq" event={"ID":"9473d2b0-bd27-437c-8163-34180f007c16","Type":"ContainerStarted","Data":"a62fff0fa30a0aef14e152282ff7b2913438271a9f079c54d7fc45374c7d92ea"} Apr 16 13:59:26.622691 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:26.622621 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zbxjq" podStartSLOduration=5.9131431150000004 podStartE2EDuration="22.622603168s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.050744537 +0000 UTC m=+3.094777701" lastFinishedPulling="2026-04-16 13:59:23.760204578 +0000 UTC m=+19.804237754" observedRunningTime="2026-04-16 13:59:26.622416614 +0000 UTC m=+22.666449801" watchObservedRunningTime="2026-04-16 13:59:26.622603168 
+0000 UTC m=+22.666636360" Apr 16 13:59:26.623363 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:26.623321 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j8v85" podStartSLOduration=5.908416095 podStartE2EDuration="22.623310589s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.045343757 +0000 UTC m=+3.089376931" lastFinishedPulling="2026-04-16 13:59:23.760238262 +0000 UTC m=+19.804271425" observedRunningTime="2026-04-16 13:59:24.728864897 +0000 UTC m=+20.772898083" watchObservedRunningTime="2026-04-16 13:59:26.623310589 +0000 UTC m=+22.667343777" Apr 16 13:59:26.702188 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:26.702008 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:27.482942 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.482802 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:26.702180371Z","UUID":"ba60dd0e-3e46-478f-a6ce-3a313e50d2a1","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:27.486316 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.486293 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:27.486316 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.486322 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:27.512181 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.512149 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:27.512423 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.512190 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:27.512423 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:27.512318 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:27.512608 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:27.512487 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:27.610750 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.610710 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" event={"ID":"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf","Type":"ContainerStarted","Data":"f961d7849f308e2451f95b6311bc41cbcddac3aec8e2199bd3a1fc5f1cdfc5f8"} Apr 16 13:59:27.614112 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.614087 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 13:59:27.614552 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:27.614524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"aa8d6a4972dd4fc712d71f2f1ea76280cf1cd2418be07a5273b6e8622ffbb171"} Apr 16 13:59:28.454680 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:28.454635 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:28.455375 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:28.455347 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:28.512634 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:28.512602 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:28.512827 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:28.512727 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:28.616939 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:28.616910 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:28.617477 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:28.617430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j8v85" Apr 16 13:59:29.410485 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.410305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:29.410614 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:29.410440 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:29.410614 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:29.410571 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret podName:0c5b0215-6e48-4512-a0fa-d432021b128c nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:37.410552565 +0000 UTC m=+33.454585730 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret") pod "global-pull-secret-syncer-l2l88" (UID: "0c5b0215-6e48-4512-a0fa-d432021b128c") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:29.512467 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.512372 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:29.512467 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.512400 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:29.512622 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:29.512489 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:29.512622 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:29.512591 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:29.620535 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.620497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" event={"ID":"61299c1a-c4f0-4b6c-9f79-4cddd37b5bcf","Type":"ContainerStarted","Data":"23ecc33ab2343e7e59b79165c888c005504ccbaa6f03388cf226a7c074c9556a"} Apr 16 13:59:29.623316 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.623297 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 13:59:29.623578 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.623559 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"b201a84efc4dba17b043587f247db5e24fcf6e737da71bfcca19f8cffbc54bc2"} Apr 16 13:59:29.623904 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.623881 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:29.624108 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.624091 2573 scope.go:117] "RemoveContainer" containerID="e0608ce4a0b0066d704605045c93c9c8b707b9611e7aa43886e29f1700b10e3c" Apr 16 13:59:29.625296 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.625271 2573 generic.go:358] "Generic (PLEG): container finished" podID="35776b6a-fb9e-462b-9392-ed0451ab2515" containerID="55fe471329a464b0a0236b324ad1a325b9f4f2eb67a660a965d232c8105c20db" exitCode=0 Apr 16 13:59:29.625379 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.625357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" 
event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerDied","Data":"55fe471329a464b0a0236b324ad1a325b9f4f2eb67a660a965d232c8105c20db"} Apr 16 13:59:29.637413 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.637373 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qvzv2" podStartSLOduration=3.438550658 podStartE2EDuration="25.637361659s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.046503743 +0000 UTC m=+3.090536919" lastFinishedPulling="2026-04-16 13:59:29.245314751 +0000 UTC m=+25.289347920" observedRunningTime="2026-04-16 13:59:29.637005887 +0000 UTC m=+25.681039072" watchObservedRunningTime="2026-04-16 13:59:29.637361659 +0000 UTC m=+25.681394849" Apr 16 13:59:29.640333 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:29.640312 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:30.520906 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.520591 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:30.520906 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:30.520753 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:30.634855 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.634834 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 13:59:30.635171 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.635144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" event={"ID":"a3c7126c-784b-47c6-88ca-c5375d70b493","Type":"ContainerStarted","Data":"3ed2edae43d2a49ba9a45eb68fd6fde5ca4f82d25c01675eef5718e5524b072f"} Apr 16 13:59:30.635297 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.635283 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:30.635416 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.635395 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:30.637060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.637033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerStarted","Data":"287124c0cd0f9f41a4a657d41612ae7c9b327c3b5d60cdb772e01ea363d745a0"} Apr 16 13:59:30.649640 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.649621 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:30.662912 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:30.662867 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" podStartSLOduration=9.532803703 podStartE2EDuration="26.662855461s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.05607316 
+0000 UTC m=+3.100106324" lastFinishedPulling="2026-04-16 13:59:24.186124907 +0000 UTC m=+20.230158082" observedRunningTime="2026-04-16 13:59:30.66254811 +0000 UTC m=+26.706581295" watchObservedRunningTime="2026-04-16 13:59:30.662855461 +0000 UTC m=+26.706888648" Apr 16 13:59:31.246273 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.246237 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l2l88"] Apr 16 13:59:31.246456 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.246366 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:31.246517 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:31.246471 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:31.247727 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.247703 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-59j5c"] Apr 16 13:59:31.247818 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.247797 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:31.247888 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:31.247870 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:31.251227 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.251205 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cb8c5"] Apr 16 13:59:31.251315 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.251291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:31.251387 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:31.251371 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:31.640548 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.640515 2573 generic.go:358] "Generic (PLEG): container finished" podID="35776b6a-fb9e-462b-9392-ed0451ab2515" containerID="287124c0cd0f9f41a4a657d41612ae7c9b327c3b5d60cdb772e01ea363d745a0" exitCode=0 Apr 16 13:59:31.640922 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.640589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerDied","Data":"287124c0cd0f9f41a4a657d41612ae7c9b327c3b5d60cdb772e01ea363d745a0"} Apr 16 13:59:31.640922 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:31.640770 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:32.512422 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:32.512341 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:32.512422 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:32.512382 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:32.512597 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:32.512459 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:32.512597 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:32.512559 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:32.644676 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:32.644621 2573 generic.go:358] "Generic (PLEG): container finished" podID="35776b6a-fb9e-462b-9392-ed0451ab2515" containerID="9443fcc874cdeb7b57111cb8b193a1b3b9f04fbf829f6d1bccb1aabd763fb644" exitCode=0 Apr 16 13:59:32.645009 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:32.644699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerDied","Data":"9443fcc874cdeb7b57111cb8b193a1b3b9f04fbf829f6d1bccb1aabd763fb644"} Apr 16 13:59:32.645045 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:32.645010 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:32.926882 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:32.926839 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl" Apr 16 13:59:33.512929 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:33.512895 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:33.513104 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:33.513007 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:34.513208 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:34.513001 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:34.513788 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:34.513061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:34.513788 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:34.513347 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:34.513788 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:34.513483 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:35.513078 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:35.512868 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:35.513078 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:35.512989 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cb8c5" podUID="c63a949a-1dfc-4b9a-a5f7-ab03e43113fb" Apr 16 13:59:36.512454 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:36.512416 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 13:59:36.512657 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:36.512437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88" Apr 16 13:59:36.512657 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:36.512554 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59j5c" podUID="45ca9ed1-9528-4529-8ffc-64027bd9e40a" Apr 16 13:59:36.512657 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:36.512591 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-l2l88" podUID="0c5b0215-6e48-4512-a0fa-d432021b128c" Apr 16 13:59:37.233279 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.233246 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-195.ec2.internal" event="NodeReady" Apr 16 13:59:37.233852 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.233409 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:37.265934 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.265900 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9dc6b545d-grd2w"] Apr 16 13:59:37.270186 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.270166 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" Apr 16 13:59:37.273216 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.273187 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hnvqt\"" Apr 16 13:59:37.273460 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.273443 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 13:59:37.273460 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.273452 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 13:59:37.273943 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.273924 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 13:59:37.280170 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.280027 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 13:59:37.281086 
ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.281049 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9dc6b545d-grd2w"] Apr 16 13:59:37.281907 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.281885 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q"] Apr 16 13:59:37.286095 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.285952 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-h9k6c"] Apr 16 13:59:37.286481 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.286451 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q" Apr 16 13:59:37.291170 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.290762 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-dfb7h\"" Apr 16 13:59:37.292583 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.292119 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.292583 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.292286 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.294417 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.294400 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"] Apr 16 13:59:37.294652 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.294637 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.297316 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.297144 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"] Apr 16 13:59:37.297396 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.297335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" Apr 16 13:59:37.298488 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.298187 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-dvwqs\"" Apr 16 13:59:37.298652 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.298632 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.299416 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.298905 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.299416 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.299110 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 13:59:37.299614 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.299465 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 13:59:37.301364 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.300936 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"] Apr 16 13:59:37.301364 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.301038 2573 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" Apr 16 13:59:37.302069 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.301626 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 13:59:37.302069 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.301933 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.302283 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.302260 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qm7nj\"" Apr 16 13:59:37.302369 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.302297 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.304371 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.304353 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"] Apr 16 13:59:37.304549 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.304515 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.305113 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.305095 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.305207 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.305137 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 13:59:37.305207 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.304657 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 13:59:37.305624 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.305609 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.306450 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.306309 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-x5tv7\"" Apr 16 13:59:37.306450 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.306324 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 13:59:37.309634 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.309339 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 13:59:37.309754 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.309662 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 13:59:37.311102 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.309816 2573 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.311102 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.310049 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2bx5n\"" Apr 16 13:59:37.311102 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.310348 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.311102 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.311034 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-56b5bd7874-fvchd"] Apr 16 13:59:37.311515 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.311494 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.314902 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.314883 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.315072 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.315053 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-8bc5k\"" Apr 16 13:59:37.315279 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.314948 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 13:59:37.315407 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.315385 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 
16 13:59:37.315970 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.315953 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.318127 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.318110 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-c2xv7"] Apr 16 13:59:37.318406 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.318388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.321170 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.321146 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.321287 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.321150 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 13:59:37.321370 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.321188 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rtnsm\"" Apr 16 13:59:37.322104 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.322079 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.322295 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.322188 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 13:59:37.322295 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.322243 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 13:59:37.322484 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.322384 2573 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"] Apr 16 13:59:37.322577 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.322560 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 13:59:37.322976 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.322959 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.327318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.327299 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.327948 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.327925 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 13:59:37.329206 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.328071 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xkbv9\"" Apr 16 13:59:37.329206 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.328198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 13:59:37.329206 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.328231 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.330894 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.330515 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr"] Apr 16 13:59:37.331462 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.331441 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" Apr 16 13:59:37.334239 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.334219 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 13:59:37.334355 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.334235 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 13:59:37.334855 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.334836 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mrpkg\"" Apr 16 13:59:37.335395 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.335375 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 13:59:37.335490 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.335425 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kqzcz"] Apr 16 13:59:37.335744 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.335727 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr" Apr 16 13:59:37.337969 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.337946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pdrr6\"" Apr 16 13:59:37.337969 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.337961 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.338253 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338034 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.338345 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338328 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q"] Apr 16 13:59:37.338476 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338460 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338499 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338510 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kqzcz"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338520 2573 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-c2xv7"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56b5bd7874-fvchd"] Apr 16 13:59:37.338557 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338462 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqzcz" Apr 16 13:59:37.338876 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338550 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-h9k6c"] Apr 16 13:59:37.338919 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.338909 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr"] Apr 16 13:59:37.339038 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.339019 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"] Apr 16 13:59:37.340804 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.340787 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:59:37.340962 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.340854 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:59:37.341108 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.341090 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:59:37.341168 
ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.341133 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gjmb2\""
Apr 16 13:59:37.368759 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7890347-e1d0-4b03-8b89-c727fa4c4f18-ca-trust-extracted\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.368759 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-image-registry-private-configuration\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.368935 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdnt\" (UniqueName: \"kubernetes.io/projected/943fa4f4-07e5-4de7-9313-95a7a017b304-kube-api-access-hcdnt\") pod \"volume-data-source-validator-7d955d5dd4-gm66q\" (UID: \"943fa4f4-07e5-4de7-9313-95a7a017b304\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q"
Apr 16 13:59:37.368935 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jxnd\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-kube-api-access-6jxnd\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.368935 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-installation-pull-secrets\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.369059 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-bound-sa-token\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.369059 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.368985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-trusted-ca\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.369059 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.369014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.369059 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.369045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-certificates\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.384530 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.384506 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8ptvc"]
Apr 16 13:59:37.388364 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.388344 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:37.391054 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.391034 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:59:37.391054 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.391049 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-pfkw7\""
Apr 16 13:59:37.391283 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.391045 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:59:37.395103 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.395084 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8ptvc"]
Apr 16 13:59:37.470169 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0191787a-6511-4245-a250-5fe459bf077c-serving-cert\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:37.470169 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-default-certificate\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdl6l\" (UniqueName: \"kubernetes.io/projected/8f6b46fd-2e36-47ee-945a-f745461401d6-kube-api-access-wdl6l\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e37e261-0537-4203-b8fd-2b1189bee139-config-volume\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-serving-cert\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6slf4\" (UniqueName: \"kubernetes.io/projected/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-kube-api-access-6slf4\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.470418 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027223a7-23fc-4750-bd60-a91d4fbb3300-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"
Apr 16 13:59:37.470478 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.470481 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret podName:0c5b0215-6e48-4512-a0fa-d432021b128c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.470460909 +0000 UTC m=+49.514494074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret") pod "global-pull-secret-syncer-l2l88" (UID: "0c5b0215-6e48-4512-a0fa-d432021b128c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7rq\" (UniqueName: \"kubernetes.io/projected/51685b01-c9b4-47f8-98e7-5e19c0d32502-kube-api-access-tw7rq\") pod \"network-check-source-7b678d77c7-55pdr\" (UID: \"51685b01-c9b4-47f8-98e7-5e19c0d32502\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-bound-sa-token\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-snapshots\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vlfb\" (UniqueName: \"kubernetes.io/projected/66aa1d26-c320-4759-bcd7-99678d388133-kube-api-access-7vlfb\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470883 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4vb\" (UniqueName: \"kubernetes.io/projected/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-kube-api-access-9v4vb\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:37.470913 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxss8\" (UniqueName: \"kubernetes.io/projected/2e37e261-0537-4203-b8fd-2b1189bee139-kube-api-access-gxss8\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.470978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-certificates\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6b46fd-2e36-47ee-945a-f745461401d6-serving-cert\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-stats-auth\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/60e08263-d0bc-469a-b7ae-b83c965fa7a3-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.471079 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e37e261-0537-4203-b8fd-2b1189bee139-tmp-dir\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.471098 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9dc6b545d-grd2w: secret "image-registry-tls" not found
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.471142 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls podName:d7890347-e1d0-4b03-8b89-c727fa4c4f18 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:37.971126619 +0000 UTC m=+34.015159786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls") pod "image-registry-9dc6b545d-grd2w" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18") : secret "image-registry-tls" not found
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6b46fd-2e36-47ee-945a-f745461401d6-config\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdnt\" (UniqueName: \"kubernetes.io/projected/943fa4f4-07e5-4de7-9313-95a7a017b304-kube-api-access-hcdnt\") pod \"volume-data-source-validator-7d955d5dd4-gm66q\" (UID: \"943fa4f4-07e5-4de7-9313-95a7a017b304\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7890347-e1d0-4b03-8b89-c727fa4c4f18-ca-trust-extracted\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471285 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/027223a7-23fc-4750-bd60-a91d4fbb3300-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-image-registry-private-configuration\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rkxj\" (UniqueName: \"kubernetes.io/projected/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-kube-api-access-7rkxj\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmzq\" (UniqueName: \"kubernetes.io/projected/027223a7-23fc-4750-bd60-a91d4fbb3300-kube-api-access-nsmzq\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jxnd\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-kube-api-access-6jxnd\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-installation-pull-secrets\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4m8n\" (UniqueName: \"kubernetes.io/projected/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-kube-api-access-z4m8n\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471535 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0191787a-6511-4245-a250-5fe459bf077c-trusted-ca\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-trusted-ca\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-certificates\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgsgv\" (UniqueName: \"kubernetes.io/projected/0191787a-6511-4245-a250-5fe459bf077c-kube-api-access-tgsgv\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471696 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-tmp\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471729 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0191787a-6511-4245-a250-5fe459bf077c-config\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:37.471899 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:37.472586 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.472586 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.471827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:37.472586 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.472209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7890347-e1d0-4b03-8b89-c727fa4c4f18-ca-trust-extracted\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.473017 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.472994 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-trusted-ca\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.476157 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.476114 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-image-registry-private-configuration\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.476157 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.476140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-installation-pull-secrets\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.481977 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.481056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-bound-sa-token\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.481977 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.481172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jxnd\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-kube-api-access-6jxnd\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:37.481977 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.481964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdnt\" (UniqueName: \"kubernetes.io/projected/943fa4f4-07e5-4de7-9313-95a7a017b304-kube-api-access-hcdnt\") pod \"volume-data-source-validator-7d955d5dd4-gm66q\" (UID: \"943fa4f4-07e5-4de7-9313-95a7a017b304\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q"
Apr 16 13:59:37.512056 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.511986 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 13:59:37.514927 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.514906 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcmqf\""
Apr 16 13:59:37.573192 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:37.573192 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0191787a-6511-4245-a250-5fe459bf077c-serving-cert\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:37.573435 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.573316 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 13:59:37.573435 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:37.573435 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-default-certificate\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:37.573435 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.573396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert podName:60e08263-d0bc-469a-b7ae-b83c965fa7a3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.073372817 +0000 UTC m=+34.117405980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6gtrd" (UID: "60e08263-d0bc-469a-b7ae-b83c965fa7a3") : secret "networking-console-plugin-cert" not found
Apr 16 13:59:37.573435 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdl6l\" (UniqueName: \"kubernetes.io/projected/8f6b46fd-2e36-47ee-945a-f745461401d6-kube-api-access-wdl6l\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e37e261-0537-4203-b8fd-2b1189bee139-config-volume\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.573468 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-serving-cert\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.573524 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls podName:66aa1d26-c320-4759-bcd7-99678d388133 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.073510158 +0000 UTC m=+34.117543337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls") pod "cluster-samples-operator-667775844f-62hlh" (UID: "66aa1d26-c320-4759-bcd7-99678d388133") : secret "samples-operator-tls" not found
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6slf4\" (UniqueName: \"kubernetes.io/projected/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-kube-api-access-6slf4\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027223a7-23fc-4750-bd60-a91d4fbb3300-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7rq\" (UniqueName: \"kubernetes.io/projected/51685b01-c9b4-47f8-98e7-5e19c0d32502-kube-api-access-tw7rq\") pod \"network-check-source-7b678d77c7-55pdr\" (UID: \"51685b01-c9b4-47f8-98e7-5e19c0d32502\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr"
Apr 16 13:59:37.573756 ip-10-0-130-195 kubenswrapper[2573]:
I0416 13:59:37.573748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-snapshots\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573776 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vlfb\" (UniqueName: \"kubernetes.io/projected/66aa1d26-c320-4759-bcd7-99678d388133-kube-api-access-7vlfb\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4vb\" (UniqueName: \"kubernetes.io/projected/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-kube-api-access-9v4vb\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxss8\" (UniqueName: \"kubernetes.io/projected/2e37e261-0537-4203-b8fd-2b1189bee139-kube-api-access-gxss8\") pod 
\"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6b46fd-2e36-47ee-945a-f745461401d6-serving-cert\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.573915 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-stats-auth\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.573963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/60e08263-d0bc-469a-b7ae-b83c965fa7a3-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.573979 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert podName:a8fbe668-dcf2-4f5d-97a8-f28c95ce8261 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.073961497 +0000 UTC m=+34.117994661 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert") pod "ingress-canary-kqzcz" (UID: "a8fbe668-dcf2-4f5d-97a8-f28c95ce8261") : secret "canary-serving-cert" not found Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e37e261-0537-4203-b8fd-2b1189bee139-tmp-dir\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6b46fd-2e36-47ee-945a-f745461401d6-config\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574114 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e37e261-0537-4203-b8fd-2b1189bee139-config-volume\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/027223a7-23fc-4750-bd60-a91d4fbb3300-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.574288 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.574171 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.074151904 +0000 UTC m=+34.118185135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : configmap references non-existent config key: service-ca.crt Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rkxj\" (UniqueName: \"kubernetes.io/projected/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-kube-api-access-7rkxj\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmzq\" (UniqueName: \"kubernetes.io/projected/027223a7-23fc-4750-bd60-a91d4fbb3300-kube-api-access-nsmzq\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4m8n\" (UniqueName: \"kubernetes.io/projected/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-kube-api-access-z4m8n\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0191787a-6511-4245-a250-5fe459bf077c-trusted-ca\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027223a7-23fc-4750-bd60-a91d4fbb3300-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tgsgv\" (UniqueName: \"kubernetes.io/projected/0191787a-6511-4245-a250-5fe459bf077c-kube-api-access-tgsgv\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-tmp\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0191787a-6511-4245-a250-5fe459bf077c-config\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.575060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.574975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/60e08263-d0bc-469a-b7ae-b83c965fa7a3-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" Apr 16 13:59:37.575691 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.575143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.575691 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.575446 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:37.575691 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.575502 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls podName:2e37e261-0537-4203-b8fd-2b1189bee139 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.075485441 +0000 UTC m=+34.119518606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls") pod "dns-default-8ptvc" (UID: "2e37e261-0537-4203-b8fd-2b1189bee139") : secret "dns-default-metrics-tls" not found Apr 16 13:59:37.575691 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.575536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" Apr 16 13:59:37.575691 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.575596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e37e261-0537-4203-b8fd-2b1189bee139-tmp-dir\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:37.575954 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.575935 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:59:37.576008 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.575979 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.075965789 +0000 UTC m=+34.119998970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : secret "router-metrics-certs-default" not found Apr 16 13:59:37.576169 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.576143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6b46fd-2e36-47ee-945a-f745461401d6-config\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.576278 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.576249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-tmp\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.576398 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.576323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0191787a-6511-4245-a250-5fe459bf077c-config\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.576398 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.576343 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:59:37.576398 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.576341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0191787a-6511-4245-a250-5fe459bf077c-trusted-ca\") pod 
\"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.576398 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.576343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-snapshots\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.576398 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.576386 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls podName:9f9e0f27-a5f3-44ee-9f24-aec06b0aa130 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.076373085 +0000 UTC m=+34.120406267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ngv2d" (UID: "9f9e0f27-a5f3-44ee-9f24-aec06b0aa130") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:59:37.576655 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.576612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0191787a-6511-4245-a250-5fe459bf077c-serving-cert\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.577606 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.577583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-serving-cert\") pod 
\"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.578061 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.578024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/027223a7-23fc-4750-bd60-a91d4fbb3300-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.578565 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.578536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6b46fd-2e36-47ee-945a-f745461401d6-serving-cert\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.579410 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.579387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-stats-auth\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.580507 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.580465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-default-certificate\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.583173 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.582568 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6slf4\" (UniqueName: \"kubernetes.io/projected/26ed3c5b-e875-42d2-b496-1c3aacfc5b95-kube-api-access-6slf4\") pod \"insights-operator-5785d4fcdd-c2xv7\" (UID: \"26ed3c5b-e875-42d2-b496-1c3aacfc5b95\") " pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.583173 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.582705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdl6l\" (UniqueName: \"kubernetes.io/projected/8f6b46fd-2e36-47ee-945a-f745461401d6-kube-api-access-wdl6l\") pod \"service-ca-operator-69965bb79d-5rth9\" (UID: \"8f6b46fd-2e36-47ee-945a-f745461401d6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.586106 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.586037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmzq\" (UniqueName: \"kubernetes.io/projected/027223a7-23fc-4750-bd60-a91d4fbb3300-kube-api-access-nsmzq\") pod \"kube-storage-version-migrator-operator-756bb7d76f-845gn\" (UID: \"027223a7-23fc-4750-bd60-a91d4fbb3300\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.586106 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.586064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxss8\" (UniqueName: \"kubernetes.io/projected/2e37e261-0537-4203-b8fd-2b1189bee139-kube-api-access-gxss8\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:37.586826 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.586802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4m8n\" (UniqueName: \"kubernetes.io/projected/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-kube-api-access-z4m8n\") pod 
\"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz" Apr 16 13:59:37.587348 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.587308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4vb\" (UniqueName: \"kubernetes.io/projected/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-kube-api-access-9v4vb\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:37.587436 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.587409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgsgv\" (UniqueName: \"kubernetes.io/projected/0191787a-6511-4245-a250-5fe459bf077c-kube-api-access-tgsgv\") pod \"console-operator-d87b8d5fc-h9k6c\" (UID: \"0191787a-6511-4245-a250-5fe459bf077c\") " pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.587493 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.587433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rkxj\" (UniqueName: \"kubernetes.io/projected/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-kube-api-access-7rkxj\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" Apr 16 13:59:37.587963 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.587932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7rq\" (UniqueName: \"kubernetes.io/projected/51685b01-c9b4-47f8-98e7-5e19c0d32502-kube-api-access-tw7rq\") pod \"network-check-source-7b678d77c7-55pdr\" (UID: \"51685b01-c9b4-47f8-98e7-5e19c0d32502\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr" Apr 16 13:59:37.588090 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.588071 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vlfb\" (UniqueName: \"kubernetes.io/projected/66aa1d26-c320-4759-bcd7-99678d388133-kube-api-access-7vlfb\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" Apr 16 13:59:37.602891 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.602867 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q" Apr 16 13:59:37.613641 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.613619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:37.644603 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.644549 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" Apr 16 13:59:37.651296 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.651276 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" Apr 16 13:59:37.665398 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.665371 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" Apr 16 13:59:37.681322 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.681295 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr" Apr 16 13:59:37.979248 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:37.979211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" Apr 16 13:59:37.979436 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.979371 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:37.979436 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.979394 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9dc6b545d-grd2w: secret "image-registry-tls" not found Apr 16 13:59:37.979528 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:37.979478 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls podName:d7890347-e1d0-4b03-8b89-c727fa4c4f18 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.979458762 +0000 UTC m=+35.023491952 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls") pod "image-registry-9dc6b545d-grd2w" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18") : secret "image-registry-tls" not found
Apr 16 13:59:38.080630 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:38.080838 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:38.080838 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:38.080838 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080773 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:38.080838 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:38.080838 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080821 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.080793982 +0000 UTC m=+35.124827147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : configmap references non-existent config key: service-ca.crt
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080859 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert podName:a8fbe668-dcf2-4f5d-97a8-f28c95ce8261 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.080846502 +0000 UTC m=+35.124879666 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert") pod "ingress-canary-kqzcz" (UID: "a8fbe668-dcf2-4f5d-97a8-f28c95ce8261") : secret "canary-serving-cert" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080878 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080888 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080924 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls podName:2e37e261-0537-4203-b8fd-2b1189bee139 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.080907292 +0000 UTC m=+35.124940462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls") pod "dns-default-8ptvc" (UID: "2e37e261-0537-4203-b8fd-2b1189bee139") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080944 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.080937464 +0000 UTC m=+35.124970627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : secret "router-metrics-certs-default" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.080979 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.080990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.081014 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls podName:9f9e0f27-a5f3-44ee-9f24-aec06b0aa130 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.08100316 +0000 UTC m=+35.125036327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ngv2d" (UID: "9f9e0f27-a5f3-44ee-9f24-aec06b0aa130") : secret "cluster-monitoring-operator-tls" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.081061 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.081095 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls podName:66aa1d26-c320-4759-bcd7-99678d388133 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.081086004 +0000 UTC m=+35.125119172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls") pod "cluster-samples-operator-667775844f-62hlh" (UID: "66aa1d26-c320-4759-bcd7-99678d388133") : secret "samples-operator-tls" not found
Apr 16 13:59:38.081123 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.081109 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 13:59:38.081575 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.081141 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert podName:60e08263-d0bc-469a-b7ae-b83c965fa7a3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.081130892 +0000 UTC m=+35.125164079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6gtrd" (UID: "60e08263-d0bc-469a-b7ae-b83c965fa7a3") : secret "networking-console-plugin-cert" not found
Apr 16 13:59:38.282352 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.282260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 13:59:38.283080 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.282518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c"
Apr 16 13:59:38.283080 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.282635 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:38.283080 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.282724 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs podName:45ca9ed1-9528-4529-8ffc-64027bd9e40a nodeName:}" failed. No retries permitted until 2026-04-16 14:00:10.282704543 +0000 UTC m=+66.326737709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs") pod "network-metrics-daemon-59j5c" (UID: "45ca9ed1-9528-4529-8ffc-64027bd9e40a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:38.285034 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.285008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfhm\" (UniqueName: \"kubernetes.io/projected/c63a949a-1dfc-4b9a-a5f7-ab03e43113fb-kube-api-access-ppfhm\") pod \"network-check-target-cb8c5\" (UID: \"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb\") " pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 13:59:38.423391 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.423356 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 13:59:38.511995 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.511964 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c"
Apr 16 13:59:38.512171 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.511976 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88"
Apr 16 13:59:38.515026 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.515005 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:59:38.515156 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.515004 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-snjtl\""
Apr 16 13:59:38.515156 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.515032 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 13:59:38.916255 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.915986 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q"]
Apr 16 13:59:38.919972 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.919549 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr"]
Apr 16 13:59:38.922328 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.922278 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cb8c5"]
Apr 16 13:59:38.924654 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.924610 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-c2xv7"]
Apr 16 13:59:38.930539 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.930519 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn"]
Apr 16 13:59:38.932478 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:38.932436 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ed3c5b_e875_42d2_b496_1c3aacfc5b95.slice/crio-c86d2e4bd29f344c686171888bf4cd3fc85648ae4ebc41cf70280714ed4dcb19 WatchSource:0}: Error finding container c86d2e4bd29f344c686171888bf4cd3fc85648ae4ebc41cf70280714ed4dcb19: Status 404 returned error can't find the container with id c86d2e4bd29f344c686171888bf4cd3fc85648ae4ebc41cf70280714ed4dcb19
Apr 16 13:59:38.937981 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:38.937947 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027223a7_23fc_4750_bd60_a91d4fbb3300.slice/crio-92cd12d995b1294d24d6a7a61c9bc8934d14dc2cc31482d97001ef0df6887bee WatchSource:0}: Error finding container 92cd12d995b1294d24d6a7a61c9bc8934d14dc2cc31482d97001ef0df6887bee: Status 404 returned error can't find the container with id 92cd12d995b1294d24d6a7a61c9bc8934d14dc2cc31482d97001ef0df6887bee
Apr 16 13:59:38.943564 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.943540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-h9k6c"]
Apr 16 13:59:38.949624 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.949522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9"]
Apr 16 13:59:38.957840 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:38.957812 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0191787a_6511_4245_a250_5fe459bf077c.slice/crio-b8cad0050225697a5bb25911b3661b4990265a547bef6b1dff98fed42e43772b WatchSource:0}: Error finding container b8cad0050225697a5bb25911b3661b4990265a547bef6b1dff98fed42e43772b: Status 404 returned error can't find the container with id b8cad0050225697a5bb25911b3661b4990265a547bef6b1dff98fed42e43772b
Apr 16 13:59:38.987697 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:38.987657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:38.987806 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.987779 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:38.987806 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.987791 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9dc6b545d-grd2w: secret "image-registry-tls" not found
Apr 16 13:59:38.987871 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:38.987838 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls podName:d7890347-e1d0-4b03-8b89-c727fa4c4f18 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.987822683 +0000 UTC m=+37.031855849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls") pod "image-registry-9dc6b545d-grd2w" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18") : secret "image-registry-tls" not found
Apr 16 13:59:38.999844 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:38.999819 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6b46fd_2e36_47ee_945a_f745461401d6.slice/crio-c30792e3822e0c99c112389f9201b5c3cd79da6db07a813223309ff2411c2882 WatchSource:0}: Error finding container c30792e3822e0c99c112389f9201b5c3cd79da6db07a813223309ff2411c2882: Status 404 returned error can't find the container with id c30792e3822e0c99c112389f9201b5c3cd79da6db07a813223309ff2411c2882
Apr 16 13:59:39.089044 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:39.089187 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:39.089187 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:39.089187 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:39.089187 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:39.089187 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089169 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089220 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089242 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls podName:2e37e261-0537-4203-b8fd-2b1189bee139 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089220906 +0000 UTC m=+37.133254087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls") pod "dns-default-8ptvc" (UID: "2e37e261-0537-4203-b8fd-2b1189bee139") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089254 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089278 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls podName:9f9e0f27-a5f3-44ee-9f24-aec06b0aa130 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089261051 +0000 UTC m=+37.133294214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ngv2d" (UID: "9f9e0f27-a5f3-44ee-9f24-aec06b0aa130") : secret "cluster-monitoring-operator-tls" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089279 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089297 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089287791 +0000 UTC m=+37.133320958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : configmap references non-existent config key: service-ca.crt
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089311 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089313 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls podName:66aa1d26-c320-4759-bcd7-99678d388133 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089304106 +0000 UTC m=+37.133337271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls") pod "cluster-samples-operator-667775844f-62hlh" (UID: "66aa1d26-c320-4759-bcd7-99678d388133") : secret "samples-operator-tls" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089341 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert podName:60e08263-d0bc-469a-b7ae-b83c965fa7a3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089331884 +0000 UTC m=+37.133365052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6gtrd" (UID: "60e08263-d0bc-469a-b7ae-b83c965fa7a3") : secret "networking-console-plugin-cert" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089351 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert podName:a8fbe668-dcf2-4f5d-97a8-f28c95ce8261 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089345647 +0000 UTC m=+37.133378810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert") pod "ingress-canary-kqzcz" (UID: "a8fbe668-dcf2-4f5d-97a8-f28c95ce8261") : secret "canary-serving-cert" not found
Apr 16 13:59:39.089422 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.089383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:39.089848 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089448 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 13:59:39.089848 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:39.089475 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.089468806 +0000 UTC m=+37.133501969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : secret "router-metrics-certs-default" not found
Apr 16 13:59:39.660629 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.660564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cb8c5" event={"ID":"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb","Type":"ContainerStarted","Data":"680f476685f8e7d5836a16a1099ec571f584ed91766affd038dd7d464a7bc520"}
Apr 16 13:59:39.662387 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.662310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" event={"ID":"8f6b46fd-2e36-47ee-945a-f745461401d6","Type":"ContainerStarted","Data":"c30792e3822e0c99c112389f9201b5c3cd79da6db07a813223309ff2411c2882"}
Apr 16 13:59:39.664900 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.664839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr" event={"ID":"51685b01-c9b4-47f8-98e7-5e19c0d32502","Type":"ContainerStarted","Data":"e7c4cec84b4622b67ae978e5d0f38c76eb4b0d8aa74cab5282b78d1e483b034b"}
Apr 16 13:59:39.666298 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.666247 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" event={"ID":"0191787a-6511-4245-a250-5fe459bf077c","Type":"ContainerStarted","Data":"b8cad0050225697a5bb25911b3661b4990265a547bef6b1dff98fed42e43772b"}
Apr 16 13:59:39.669660 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.669600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q" event={"ID":"943fa4f4-07e5-4de7-9313-95a7a017b304","Type":"ContainerStarted","Data":"66adc320c3a8d7e86cddb250acbc90c7486b0963b8d122bdeb3ce7018b65e4e1"}
Apr 16 13:59:39.671764 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.671706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" event={"ID":"26ed3c5b-e875-42d2-b496-1c3aacfc5b95","Type":"ContainerStarted","Data":"c86d2e4bd29f344c686171888bf4cd3fc85648ae4ebc41cf70280714ed4dcb19"}
Apr 16 13:59:39.673370 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.673310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" event={"ID":"027223a7-23fc-4750-bd60-a91d4fbb3300","Type":"ContainerStarted","Data":"92cd12d995b1294d24d6a7a61c9bc8934d14dc2cc31482d97001ef0df6887bee"}
Apr 16 13:59:39.678591 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.677905 2573 generic.go:358] "Generic (PLEG): container finished" podID="35776b6a-fb9e-462b-9392-ed0451ab2515" containerID="aa48fdf89f4fdc0098c523e53da3278c66856aba9e905c1f27f647907fe5f5b4" exitCode=0
Apr 16 13:59:39.678591 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:39.677946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerDied","Data":"aa48fdf89f4fdc0098c523e53da3278c66856aba9e905c1f27f647907fe5f5b4"}
Apr 16 13:59:40.692897 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:40.691873 2573 generic.go:358] "Generic (PLEG): container finished" podID="35776b6a-fb9e-462b-9392-ed0451ab2515" containerID="d9fd3932360e1f7e98058e51991b89881e5dd87f6f1e5ec491321a120e1a8fc2" exitCode=0
Apr 16 13:59:40.692897 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:40.691930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerDied","Data":"d9fd3932360e1f7e98058e51991b89881e5dd87f6f1e5ec491321a120e1a8fc2"}
Apr 16 13:59:41.008243 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.008134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:41.008451 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.008356 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:41.008451 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.008445 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9dc6b545d-grd2w: secret "image-registry-tls" not found
Apr 16 13:59:41.008579 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.008511 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls podName:d7890347-e1d0-4b03-8b89-c727fa4c4f18 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.008491402 +0000 UTC m=+41.052524571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls") pod "image-registry-9dc6b545d-grd2w" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18") : secret "image-registry-tls" not found
Apr 16 13:59:41.108926 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.108843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:41.108926 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.108911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:41.109158 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.108955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:41.109158 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.108986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:41.109158 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.109021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:41.109158 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.109069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:41.109158 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:41.109113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:41.109397 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109280 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.109259584 +0000 UTC m=+41.153292755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : configmap references non-existent config key: service-ca.crt
Apr 16 13:59:41.109743 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109702 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 13:59:41.109826 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109762 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.109742879 +0000 UTC m=+41.153776049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : secret "router-metrics-certs-default" not found
Apr 16 13:59:41.109826 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109824 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:41.109939 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109851 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls podName:2e37e261-0537-4203-b8fd-2b1189bee139 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.109841862 +0000 UTC m=+41.153875031 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls") pod "dns-default-8ptvc" (UID: "2e37e261-0537-4203-b8fd-2b1189bee139") : secret "dns-default-metrics-tls" not found Apr 16 13:59:41.109939 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109900 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:59:41.109939 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109930 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls podName:9f9e0f27-a5f3-44ee-9f24-aec06b0aa130 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.10992021 +0000 UTC m=+41.153953374 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ngv2d" (UID: "9f9e0f27-a5f3-44ee-9f24-aec06b0aa130") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:59:41.110096 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.109977 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:41.110096 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.110008 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert podName:60e08263-d0bc-469a-b7ae-b83c965fa7a3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.109996957 +0000 UTC m=+41.154030127 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6gtrd" (UID: "60e08263-d0bc-469a-b7ae-b83c965fa7a3") : secret "networking-console-plugin-cert" not found Apr 16 13:59:41.110096 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.110066 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:41.110096 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.110094 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls podName:66aa1d26-c320-4759-bcd7-99678d388133 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.110085118 +0000 UTC m=+41.154118286 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls") pod "cluster-samples-operator-667775844f-62hlh" (UID: "66aa1d26-c320-4759-bcd7-99678d388133") : secret "samples-operator-tls" not found Apr 16 13:59:41.110278 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.110137 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:41.110278 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:41.110163 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert podName:a8fbe668-dcf2-4f5d-97a8-f28c95ce8261 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:45.110154966 +0000 UTC m=+41.154188134 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert") pod "ingress-canary-kqzcz" (UID: "a8fbe668-dcf2-4f5d-97a8-f28c95ce8261") : secret "canary-serving-cert" not found Apr 16 13:59:45.048559 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.048331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" Apr 16 13:59:45.049115 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.048483 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:45.049115 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.048650 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9dc6b545d-grd2w: secret "image-registry-tls" not found Apr 16 13:59:45.049115 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.048748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls podName:d7890347-e1d0-4b03-8b89-c727fa4c4f18 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.048729062 +0000 UTC m=+49.092762228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls") pod "image-registry-9dc6b545d-grd2w" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18") : secret "image-registry-tls" not found Apr 16 13:59:45.150014 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.149971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:45.150200 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.150029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc" Apr 16 13:59:45.150200 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.150091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" Apr 16 13:59:45.150200 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.150130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" Apr 16 13:59:45.150200 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150139 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:45.150200 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.150153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" Apr 16 13:59:45.150200 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls podName:2e37e261-0537-4203-b8fd-2b1189bee139 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.150182314 +0000 UTC m=+49.194215492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls") pod "dns-default-8ptvc" (UID: "2e37e261-0537-4203-b8fd-2b1189bee139") : secret "dns-default-metrics-tls" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150210 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150230 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150240 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150264 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls podName:66aa1d26-c320-4759-bcd7-99678d388133 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.150253579 +0000 UTC m=+49.194286743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls") pod "cluster-samples-operator-667775844f-62hlh" (UID: "66aa1d26-c320-4759-bcd7-99678d388133") : secret "samples-operator-tls" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150270 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150276 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.150270269 +0000 UTC m=+49.194303432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : secret "router-metrics-certs-default" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.150374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz" Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150414 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150427 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls podName:9f9e0f27-a5f3-44ee-9f24-aec06b0aa130 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.15041272 +0000 UTC m=+49.194445885 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ngv2d" (UID: "9f9e0f27-a5f3-44ee-9f24-aec06b0aa130") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150449 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert podName:60e08263-d0bc-469a-b7ae-b83c965fa7a3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.15043932 +0000 UTC m=+49.194472486 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6gtrd" (UID: "60e08263-d0bc-469a-b7ae-b83c965fa7a3") : secret "networking-console-plugin-cert" not found Apr 16 13:59:45.150508 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:45.150476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 13:59:45.150910 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150528 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.150514949 +0000 UTC m=+49.194548126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : configmap references non-existent config key: service-ca.crt Apr 16 13:59:45.150910 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:45.150589 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert podName:a8fbe668-dcf2-4f5d-97a8-f28c95ce8261 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.15057927 +0000 UTC m=+49.194612438 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert") pod "ingress-canary-kqzcz" (UID: "a8fbe668-dcf2-4f5d-97a8-f28c95ce8261") : secret "canary-serving-cert" not found Apr 16 13:59:46.706802 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.706764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr" event={"ID":"51685b01-c9b4-47f8-98e7-5e19c0d32502","Type":"ContainerStarted","Data":"5b51ceb4f27792cb85e28bc4bb64e7ac8069814a16baab9a2e4e33827d781056"} Apr 16 13:59:46.708318 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.708291 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/0.log" Apr 16 13:59:46.708456 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.708336 2573 generic.go:358] "Generic (PLEG): container finished" podID="0191787a-6511-4245-a250-5fe459bf077c" containerID="59a932db0a01778ad09c40cafa253e20c636e5b58afaa67a60681a4be5e25469" exitCode=255 Apr 16 13:59:46.708456 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.708402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" event={"ID":"0191787a-6511-4245-a250-5fe459bf077c","Type":"ContainerDied","Data":"59a932db0a01778ad09c40cafa253e20c636e5b58afaa67a60681a4be5e25469"} Apr 16 13:59:46.708657 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.708639 2573 scope.go:117] "RemoveContainer" containerID="59a932db0a01778ad09c40cafa253e20c636e5b58afaa67a60681a4be5e25469" Apr 16 13:59:46.710800 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.710777 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q" 
event={"ID":"943fa4f4-07e5-4de7-9313-95a7a017b304","Type":"ContainerStarted","Data":"5b977e9ef458e8a417e54359125a8f93e7a763420777c8f2a318bae41a3ee3d2"} Apr 16 13:59:46.712619 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.712585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" event={"ID":"26ed3c5b-e875-42d2-b496-1c3aacfc5b95","Type":"ContainerStarted","Data":"19bc4f1445ba644ced630ecaf5477b4e892954da30ef764dcf53b56b25dc17e3"} Apr 16 13:59:46.713849 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.713814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" event={"ID":"027223a7-23fc-4750-bd60-a91d4fbb3300","Type":"ContainerStarted","Data":"0dcd71df7169eb27df415a60e5dd2a4a00a3389c4f5e8c0542c06138d787c75e"} Apr 16 13:59:46.717559 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.717537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-twv7g" event={"ID":"35776b6a-fb9e-462b-9392-ed0451ab2515","Type":"ContainerStarted","Data":"a36e6cb79d37e87d21c890c190279e1b0dd6addf353e2e230c28f96fbdc10b76"} Apr 16 13:59:46.719370 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.719339 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cb8c5" event={"ID":"c63a949a-1dfc-4b9a-a5f7-ab03e43113fb","Type":"ContainerStarted","Data":"36259d7d8e232c022b42b75b79f99bd851c79cd8c69ccb791fae866e4e05695c"} Apr 16 13:59:46.719560 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.719546 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cb8c5" Apr 16 13:59:46.720690 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.720660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" event={"ID":"8f6b46fd-2e36-47ee-945a-f745461401d6","Type":"ContainerStarted","Data":"64bff32d5615684a6aa9f5b42cbe05409585f3584a7fc49bf42ecb871cba3f25"} Apr 16 13:59:46.725134 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.725088 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-55pdr" podStartSLOduration=29.467884727 podStartE2EDuration="36.725074032s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:38.931923756 +0000 UTC m=+34.975956953" lastFinishedPulling="2026-04-16 13:59:46.189113095 +0000 UTC m=+42.233146258" observedRunningTime="2026-04-16 13:59:46.723915532 +0000 UTC m=+42.767948720" watchObservedRunningTime="2026-04-16 13:59:46.725074032 +0000 UTC m=+42.769107219" Apr 16 13:59:46.745088 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.745021 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cb8c5" podStartSLOduration=35.478004833 podStartE2EDuration="42.745004649s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:38.931928449 +0000 UTC m=+34.975961623" lastFinishedPulling="2026-04-16 13:59:46.198928275 +0000 UTC m=+42.242961439" observedRunningTime="2026-04-16 13:59:46.742092027 +0000 UTC m=+42.786125215" watchObservedRunningTime="2026-04-16 13:59:46.745004649 +0000 UTC m=+42.789037834" Apr 16 13:59:46.764261 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.764223 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-twv7g" podStartSLOduration=10.970247065 podStartE2EDuration="42.764209573s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 13:59:07.055170049 +0000 UTC m=+3.099203213" lastFinishedPulling="2026-04-16 
13:59:38.849132531 +0000 UTC m=+34.893165721" observedRunningTime="2026-04-16 13:59:46.762986836 +0000 UTC m=+42.807020023" watchObservedRunningTime="2026-04-16 13:59:46.764209573 +0000 UTC m=+42.808242759" Apr 16 13:59:46.779165 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.779118 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gm66q" podStartSLOduration=29.523971421 podStartE2EDuration="36.779099262s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:38.933417216 +0000 UTC m=+34.977450388" lastFinishedPulling="2026-04-16 13:59:46.188545065 +0000 UTC m=+42.232578229" observedRunningTime="2026-04-16 13:59:46.778525661 +0000 UTC m=+42.822558844" watchObservedRunningTime="2026-04-16 13:59:46.779099262 +0000 UTC m=+42.823132447" Apr 16 13:59:46.798638 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.798587 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" podStartSLOduration=29.543728353 podStartE2EDuration="36.798571412s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:38.934460437 +0000 UTC m=+34.978493616" lastFinishedPulling="2026-04-16 13:59:46.189303496 +0000 UTC m=+42.233336675" observedRunningTime="2026-04-16 13:59:46.797927238 +0000 UTC m=+42.841960426" watchObservedRunningTime="2026-04-16 13:59:46.798571412 +0000 UTC m=+42.842604599" Apr 16 13:59:46.819157 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.818010 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" podStartSLOduration=29.56742057 podStartE2EDuration="36.817992442s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:38.938766546 +0000 UTC 
m=+34.982799724" lastFinishedPulling="2026-04-16 13:59:46.189338429 +0000 UTC m=+42.233371596" observedRunningTime="2026-04-16 13:59:46.816011009 +0000 UTC m=+42.860044196" watchObservedRunningTime="2026-04-16 13:59:46.817992442 +0000 UTC m=+42.862025630" Apr 16 13:59:46.839553 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:46.838965 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" podStartSLOduration=29.721558944999998 podStartE2EDuration="36.838946504s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:39.001701975 +0000 UTC m=+35.045735139" lastFinishedPulling="2026-04-16 13:59:46.119089519 +0000 UTC m=+42.163122698" observedRunningTime="2026-04-16 13:59:46.837285505 +0000 UTC m=+42.881318694" watchObservedRunningTime="2026-04-16 13:59:46.838946504 +0000 UTC m=+42.882979692" Apr 16 13:59:47.614224 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.614188 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:47.614224 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.614227 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" Apr 16 13:59:47.725633 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.725600 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 13:59:47.726106 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.726089 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/0.log" Apr 16 13:59:47.726165 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.726126 
2573 generic.go:358] "Generic (PLEG): container finished" podID="0191787a-6511-4245-a250-5fe459bf077c" containerID="6b865d4e62f50bc38279b67454733b7892a26377a21161d75b06ff39d737b9d7" exitCode=255 Apr 16 13:59:47.726270 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.726244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" event={"ID":"0191787a-6511-4245-a250-5fe459bf077c","Type":"ContainerDied","Data":"6b865d4e62f50bc38279b67454733b7892a26377a21161d75b06ff39d737b9d7"} Apr 16 13:59:47.726333 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.726293 2573 scope.go:117] "RemoveContainer" containerID="59a932db0a01778ad09c40cafa253e20c636e5b58afaa67a60681a4be5e25469" Apr 16 13:59:47.726918 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.726473 2573 scope.go:117] "RemoveContainer" containerID="6b865d4e62f50bc38279b67454733b7892a26377a21161d75b06ff39d737b9d7" Apr 16 13:59:47.726918 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:47.726711 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-h9k6c_openshift-console-operator(0191787a-6511-4245-a250-5fe459bf077c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" podUID="0191787a-6511-4245-a250-5fe459bf077c" Apr 16 13:59:47.765236 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.765212 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"] Apr 16 13:59:47.790245 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.790217 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"] Apr 16 13:59:47.790375 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.790329 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"
Apr 16 13:59:47.792780 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.792755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:47.792887 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.792755 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-rwp5c\""
Apr 16 13:59:47.792887 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.792802 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 13:59:47.877693 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.877583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz5r7\" (UniqueName: \"kubernetes.io/projected/2513a90f-fe9e-479f-988c-ff9994dfbf16-kube-api-access-rz5r7\") pod \"migrator-64d4d94569-7twj2\" (UID: \"2513a90f-fe9e-479f-988c-ff9994dfbf16\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"
Apr 16 13:59:47.978151 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.978111 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz5r7\" (UniqueName: \"kubernetes.io/projected/2513a90f-fe9e-479f-988c-ff9994dfbf16-kube-api-access-rz5r7\") pod \"migrator-64d4d94569-7twj2\" (UID: \"2513a90f-fe9e-479f-988c-ff9994dfbf16\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"
Apr 16 13:59:47.990147 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:47.990117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz5r7\" (UniqueName: \"kubernetes.io/projected/2513a90f-fe9e-479f-988c-ff9994dfbf16-kube-api-access-rz5r7\") pod \"migrator-64d4d94569-7twj2\" (UID: \"2513a90f-fe9e-479f-988c-ff9994dfbf16\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"
Apr 16 13:59:48.100063 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:48.100017 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"
Apr 16 13:59:48.471823 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:48.471746 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2"]
Apr 16 13:59:48.474780 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:48.474753 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2513a90f_fe9e_479f_988c_ff9994dfbf16.slice/crio-befe143e376bc8c289a712fd6b8bac4e2e46be03dc47bb47347489e2dd61f4c6 WatchSource:0}: Error finding container befe143e376bc8c289a712fd6b8bac4e2e46be03dc47bb47347489e2dd61f4c6: Status 404 returned error can't find the container with id befe143e376bc8c289a712fd6b8bac4e2e46be03dc47bb47347489e2dd61f4c6
Apr 16 13:59:48.729736 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:48.729653 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log"
Apr 16 13:59:48.730184 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:48.730088 2573 scope.go:117] "RemoveContainer" containerID="6b865d4e62f50bc38279b67454733b7892a26377a21161d75b06ff39d737b9d7"
Apr 16 13:59:48.730333 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:48.730311 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-h9k6c_openshift-console-operator(0191787a-6511-4245-a250-5fe459bf077c)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" podUID="0191787a-6511-4245-a250-5fe459bf077c"
Apr 16 13:59:48.730771 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:48.730750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2" event={"ID":"2513a90f-fe9e-479f-988c-ff9994dfbf16","Type":"ContainerStarted","Data":"befe143e376bc8c289a712fd6b8bac4e2e46be03dc47bb47347489e2dd61f4c6"}
Apr 16 13:59:49.949987 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.949953 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-dt4cm"]
Apr 16 13:59:49.953550 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.953524 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:49.956015 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.955994 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 13:59:49.956167 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.956048 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 13:59:49.956167 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.956064 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 13:59:49.957291 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.957272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-2lfsq\""
Apr 16 13:59:49.957430 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.957317 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 13:59:49.961614 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.961572 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-dt4cm"]
Apr 16 13:59:49.998032 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.998000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-signing-key\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:49.998198 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.998179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-signing-cabundle\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:49.998292 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:49.998274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr27m\" (UniqueName: \"kubernetes.io/projected/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-kube-api-access-dr27m\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.099077 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.099044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-signing-cabundle\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.099256 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.099130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr27m\" (UniqueName: \"kubernetes.io/projected/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-kube-api-access-dr27m\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.099256 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.099172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-signing-key\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.099896 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.099866 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-signing-cabundle\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.101897 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.101873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-signing-key\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.109773 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.109748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr27m\" (UniqueName: \"kubernetes.io/projected/375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7-kube-api-access-dr27m\") pod \"service-ca-bfc587fb7-dt4cm\" (UID: \"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7\") " pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.265550 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.265480 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm"
Apr 16 13:59:50.392836 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.392810 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-dt4cm"]
Apr 16 13:59:50.395575 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:50.395546 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375d8f75_c0fe_4fc8_bdeb_d55ad6d95ae7.slice/crio-446400551d7cae1c9a8a57a4a394e230257cd827c6d0db014b7e862721304d9a WatchSource:0}: Error finding container 446400551d7cae1c9a8a57a4a394e230257cd827c6d0db014b7e862721304d9a: Status 404 returned error can't find the container with id 446400551d7cae1c9a8a57a4a394e230257cd827c6d0db014b7e862721304d9a
Apr 16 13:59:50.746555 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.746517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2" event={"ID":"2513a90f-fe9e-479f-988c-ff9994dfbf16","Type":"ContainerStarted","Data":"6dd7d9954d9a5a45a09c06bc3ab91baa634ee60dad16e4900280d35e29ebc998"}
Apr 16 13:59:50.746555 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.746561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2" event={"ID":"2513a90f-fe9e-479f-988c-ff9994dfbf16","Type":"ContainerStarted","Data":"f21dab0872d219d4fc6677105ee83730a1073af2b6f3072960b68f5bf87210f8"}
Apr 16 13:59:50.747875 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.747852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm" event={"ID":"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7","Type":"ContainerStarted","Data":"04e4d3d976faf5422777302eada1fbe150148ef3e11fdfb05a7dc37bbfc1d886"}
Apr 16 13:59:50.747948 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.747880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm" event={"ID":"375d8f75-c0fe-4fc8-bdeb-d55ad6d95ae7","Type":"ContainerStarted","Data":"446400551d7cae1c9a8a57a4a394e230257cd827c6d0db014b7e862721304d9a"}
Apr 16 13:59:50.766060 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:50.766016 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7twj2" podStartSLOduration=1.966414374 podStartE2EDuration="3.766003846s" podCreationTimestamp="2026-04-16 13:59:47 +0000 UTC" firstStartedPulling="2026-04-16 13:59:48.476956722 +0000 UTC m=+44.520989889" lastFinishedPulling="2026-04-16 13:59:50.276546189 +0000 UTC m=+46.320579361" observedRunningTime="2026-04-16 13:59:50.76459644 +0000 UTC m=+46.808629638" watchObservedRunningTime="2026-04-16 13:59:50.766003846 +0000 UTC m=+46.810037032"
Apr 16 13:59:51.784803 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:51.784770 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rh5gj_ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d/dns-node-resolver/0.log"
Apr 16 13:59:52.783118 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:52.783088 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kh2z9_f4089fed-dbf0-4e49-a815-79b762aba862/node-ca/0.log"
Apr 16 13:59:53.126416 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.126377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 13:59:53.126847 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.126535 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:53.126847 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.126554 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9dc6b545d-grd2w: secret "image-registry-tls" not found
Apr 16 13:59:53.126847 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.126645 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls podName:d7890347-e1d0-4b03-8b89-c727fa4c4f18 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.126629165 +0000 UTC m=+65.170662329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls") pod "image-registry-9dc6b545d-grd2w" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18") : secret "image-registry-tls" not found
Apr 16 13:59:53.226846 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.226814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 13:59:53.226999 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.226856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 13:59:53.226999 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.226879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 13:59:53.226999 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.226908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:53.226999 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.226943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 13:59:53.226999 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.226953 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 13:59:53.226999 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.226997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227022 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls podName:9f9e0f27-a5f3-44ee-9f24-aec06b0aa130 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.227007491 +0000 UTC m=+65.271040655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ngv2d" (UID: "9f9e0f27-a5f3-44ee-9f24-aec06b0aa130") : secret "cluster-monitoring-operator-tls" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227031 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.226955 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227082 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert podName:60e08263-d0bc-469a-b7ae-b83c965fa7a3 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.227065982 +0000 UTC m=+65.271099151 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6gtrd" (UID: "60e08263-d0bc-469a-b7ae-b83c965fa7a3") : secret "networking-console-plugin-cert" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227098 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.227090299 +0000 UTC m=+65.271123463 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : configmap references non-existent config key: service-ca.crt
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227106 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.227145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227158 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls podName:2e37e261-0537-4203-b8fd-2b1189bee139 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.227135301 +0000 UTC m=+65.271168475 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls") pod "dns-default-8ptvc" (UID: "2e37e261-0537-4203-b8fd-2b1189bee139") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert podName:a8fbe668-dcf2-4f5d-97a8-f28c95ce8261 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.227184138 +0000 UTC m=+65.271217311 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert") pod "ingress-canary-kqzcz" (UID: "a8fbe668-dcf2-4f5d-97a8-f28c95ce8261") : secret "canary-serving-cert" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227203 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 13:59:53.227296 ip-10-0-130-195 kubenswrapper[2573]: E0416 13:59:53.227235 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs podName:fc51c8a5-99bf-42f4-9371-6d41a4f0fc91 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.227224147 +0000 UTC m=+65.271257313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs") pod "router-default-56b5bd7874-fvchd" (UID: "fc51c8a5-99bf-42f4-9371-6d41a4f0fc91") : secret "router-metrics-certs-default" not found
Apr 16 13:59:53.229501 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.229470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/66aa1d26-c320-4759-bcd7-99678d388133-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-62hlh\" (UID: \"66aa1d26-c320-4759-bcd7-99678d388133\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:53.521773 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.521695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"
Apr 16 13:59:53.529928 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.529900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88"
Apr 16 13:59:53.532549 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.532528 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0c5b0215-6e48-4512-a0fa-d432021b128c-original-pull-secret\") pod \"global-pull-secret-syncer-l2l88\" (UID: \"0c5b0215-6e48-4512-a0fa-d432021b128c\") " pod="kube-system/global-pull-secret-syncer-l2l88"
Apr 16 13:59:53.538535 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.538515 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l2l88"
Apr 16 13:59:53.586800 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.586719 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7twj2_2513a90f-fe9e-479f-988c-ff9994dfbf16/migrator/0.log"
Apr 16 13:59:53.659982 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.659931 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-dt4cm" podStartSLOduration=4.659910676 podStartE2EDuration="4.659910676s" podCreationTimestamp="2026-04-16 13:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:50.78111411 +0000 UTC m=+46.825147319" watchObservedRunningTime="2026-04-16 13:59:53.659910676 +0000 UTC m=+49.703943863"
Apr 16 13:59:53.660840 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.660802 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh"]
Apr 16 13:59:53.676381 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.676357 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l2l88"]
Apr 16 13:59:53.679300 ip-10-0-130-195 kubenswrapper[2573]: W0416 13:59:53.679267 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5b0215_6e48_4512_a0fa_d432021b128c.slice/crio-3e2e1ed15ff7885f902706bfd40df8df71eb2946be084272a0f0b9d1f4033a63 WatchSource:0}: Error finding container 3e2e1ed15ff7885f902706bfd40df8df71eb2946be084272a0f0b9d1f4033a63: Status 404 returned error can't find the container with id 3e2e1ed15ff7885f902706bfd40df8df71eb2946be084272a0f0b9d1f4033a63
Apr 16 13:59:53.757385 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.757352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l2l88" event={"ID":"0c5b0215-6e48-4512-a0fa-d432021b128c","Type":"ContainerStarted","Data":"3e2e1ed15ff7885f902706bfd40df8df71eb2946be084272a0f0b9d1f4033a63"}
Apr 16 13:59:53.758284 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.758260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" event={"ID":"66aa1d26-c320-4759-bcd7-99678d388133","Type":"ContainerStarted","Data":"7637fec30001206c244bac6c575fa0318184b5341423962ee1b35ab760ace6a8"}
Apr 16 13:59:53.784275 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.784210 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7twj2_2513a90f-fe9e-479f-988c-ff9994dfbf16/graceful-termination/0.log"
Apr 16 13:59:53.986578 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:53.986550 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-845gn_027223a7-23fc-4750-bd60-a91d4fbb3300/kube-storage-version-migrator-operator/0.log"
Apr 16 13:59:57.614235 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:57.614195 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:57.614235 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:57.614240 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:57.614837 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:57.614765 2573 scope.go:117] "RemoveContainer" containerID="6b865d4e62f50bc38279b67454733b7892a26377a21161d75b06ff39d737b9d7"
Apr 16 13:59:58.774491 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.774461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log"
Apr 16 13:59:58.774961 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.774543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" event={"ID":"0191787a-6511-4245-a250-5fe459bf077c","Type":"ContainerStarted","Data":"ba5c07d5e06a8516bf01fe0a6e4314f2abe3754e8e92b2303b490370313e4c90"}
Apr 16 13:59:58.774961 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.774907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 13:59:58.776429 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.776403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" event={"ID":"66aa1d26-c320-4759-bcd7-99678d388133","Type":"ContainerStarted","Data":"3941d39b9d94a032dd3e483b7f771cb44453438f2184a3402f2bfc248255a7da"}
Apr 16 13:59:58.776608 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.776583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" event={"ID":"66aa1d26-c320-4759-bcd7-99678d388133","Type":"ContainerStarted","Data":"94a0b4e7e1ed5dc7d259716bbce9d3868300e5cabaee605843e606186fc108e4"}
Apr 16 13:59:58.777659 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.777638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l2l88" event={"ID":"0c5b0215-6e48-4512-a0fa-d432021b128c","Type":"ContainerStarted","Data":"b06115325a8a8e7f679273e2779d72f581682768d062391925e62ca47219c8f0"}
Apr 16 13:59:58.791410 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.791371 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c" podStartSLOduration=41.562401193 podStartE2EDuration="48.791360843s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:38.959949959 +0000 UTC m=+35.003983123" lastFinishedPulling="2026-04-16 13:59:46.188909595 +0000 UTC m=+42.232942773" observedRunningTime="2026-04-16 13:59:58.790933215 +0000 UTC m=+54.834966401" watchObservedRunningTime="2026-04-16 13:59:58.791360843 +0000 UTC m=+54.835394026"
Apr 16 13:59:58.805740 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.805689 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-62hlh" podStartSLOduration=44.303785994 podStartE2EDuration="48.80565732s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 13:59:53.699846854 +0000 UTC m=+49.743880022" lastFinishedPulling="2026-04-16 13:59:58.201718176 +0000 UTC m=+54.245751348" observedRunningTime="2026-04-16 13:59:58.805012118 +0000 UTC m=+54.849045337" watchObservedRunningTime="2026-04-16 13:59:58.80565732 +0000 UTC m=+54.849690506"
Apr 16 13:59:58.818426 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:58.818388 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-l2l88" podStartSLOduration=33.283931879 podStartE2EDuration="37.818376521s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:53.681030113 +0000 UTC m=+49.725063277" lastFinishedPulling="2026-04-16 13:59:58.215474754 +0000 UTC m=+54.259507919" observedRunningTime="2026-04-16 13:59:58.818050967 +0000 UTC m=+54.862084154" watchObservedRunningTime="2026-04-16 13:59:58.818376521 +0000 UTC m=+54.862409707"
Apr 16 13:59:59.075124 ip-10-0-130-195 kubenswrapper[2573]: I0416 13:59:59.075093 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-h9k6c"
Apr 16 14:00:03.657552 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:03.657525 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kftpl"
Apr 16 14:00:09.179554 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.179516 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 14:00:09.182023 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.181992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"image-registry-9dc6b545d-grd2w\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 14:00:09.280643 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.280603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 14:00:09.280849 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.280686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 14:00:09.280849 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.280721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 14:00:09.280849 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.280766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 14:00:09.280849 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.280808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 14:00:09.281093 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.280873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 14:00:09.281541 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.281510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-service-ca-bundle\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 14:00:09.283237 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.283212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e37e261-0537-4203-b8fd-2b1189bee139-metrics-tls\") pod \"dns-default-8ptvc\" (UID: \"2e37e261-0537-4203-b8fd-2b1189bee139\") " pod="openshift-dns/dns-default-8ptvc"
Apr 16 14:00:09.283713 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.283693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc51c8a5-99bf-42f4-9371-6d41a4f0fc91-metrics-certs\") pod \"router-default-56b5bd7874-fvchd\" (UID: \"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91\") " pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 14:00:09.283786 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.283759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fbe668-dcf2-4f5d-97a8-f28c95ce8261-cert\") pod \"ingress-canary-kqzcz\" (UID: \"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261\") " pod="openshift-ingress-canary/ingress-canary-kqzcz"
Apr 16 14:00:09.283786 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.283769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/60e08263-d0bc-469a-b7ae-b83c965fa7a3-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6gtrd\" (UID: \"60e08263-d0bc-469a-b7ae-b83c965fa7a3\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"
Apr 16 14:00:09.283862 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.283759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f9e0f27-a5f3-44ee-9f24-aec06b0aa130-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ngv2d\" (UID: \"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 14:00:09.386058 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.386028 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hnvqt\""
Apr 16 14:00:09.393251 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.393225 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 14:00:09.440078 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.439845 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-x5tv7\""
Apr 16 14:00:09.448038 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.448003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"
Apr 16 14:00:09.460526 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.460507 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rtnsm\""
Apr 16 14:00:09.469277 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.468990 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-56b5bd7874-fvchd"
Apr 16 14:00:09.476131 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.476065 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mrpkg\""
Apr 16 14:00:09.483906 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.483877 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" Apr 16 14:00:09.491588 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.491378 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gjmb2\"" Apr 16 14:00:09.498871 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.498848 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqzcz" Apr 16 14:00:09.501862 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.501826 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-pfkw7\"" Apr 16 14:00:09.509022 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.508903 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8ptvc" Apr 16 14:00:09.525972 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.525924 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9dc6b545d-grd2w"] Apr 16 14:00:09.530009 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:09.529950 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7890347_e1d0_4b03_8b89_c727fa4c4f18.slice/crio-f44fb8a30a599d0fde5a497e003c1c7829be73342ca31ee78548280b71ef920b WatchSource:0}: Error finding container f44fb8a30a599d0fde5a497e003c1c7829be73342ca31ee78548280b71ef920b: Status 404 returned error can't find the container with id f44fb8a30a599d0fde5a497e003c1c7829be73342ca31ee78548280b71ef920b Apr 16 14:00:09.639650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.638759 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d"] Apr 16 14:00:09.641846 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:09.641817 2573 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f9e0f27_a5f3_44ee_9f24_aec06b0aa130.slice/crio-654a58984185957d7ab8322b40bbfbc511b5632bc0c26ea9f85a524f4d820da7 WatchSource:0}: Error finding container 654a58984185957d7ab8322b40bbfbc511b5632bc0c26ea9f85a524f4d820da7: Status 404 returned error can't find the container with id 654a58984185957d7ab8322b40bbfbc511b5632bc0c26ea9f85a524f4d820da7 Apr 16 14:00:09.701702 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.701654 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kqzcz"] Apr 16 14:00:09.724404 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:09.724372 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fbe668_dcf2_4f5d_97a8_f28c95ce8261.slice/crio-96c46ce35456a3cd870336e65791ae8e019345b595bb590665c75db54607b678 WatchSource:0}: Error finding container 96c46ce35456a3cd870336e65791ae8e019345b595bb590665c75db54607b678: Status 404 returned error can't find the container with id 96c46ce35456a3cd870336e65791ae8e019345b595bb590665c75db54607b678 Apr 16 14:00:09.808206 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.808168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" event={"ID":"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130","Type":"ContainerStarted","Data":"654a58984185957d7ab8322b40bbfbc511b5632bc0c26ea9f85a524f4d820da7"} Apr 16 14:00:09.809522 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.809494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" event={"ID":"d7890347-e1d0-4b03-8b89-c727fa4c4f18","Type":"ContainerStarted","Data":"694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de"} Apr 16 14:00:09.809951 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.809529 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" event={"ID":"d7890347-e1d0-4b03-8b89-c727fa4c4f18","Type":"ContainerStarted","Data":"f44fb8a30a599d0fde5a497e003c1c7829be73342ca31ee78548280b71ef920b"} Apr 16 14:00:09.809951 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.809548 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" Apr 16 14:00:09.810500 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.810471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kqzcz" event={"ID":"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261","Type":"ContainerStarted","Data":"96c46ce35456a3cd870336e65791ae8e019345b595bb590665c75db54607b678"} Apr 16 14:00:09.827453 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.827410 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" podStartSLOduration=64.827396925 podStartE2EDuration="1m4.827396925s" podCreationTimestamp="2026-04-16 13:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:09.826515921 +0000 UTC m=+65.870549108" watchObservedRunningTime="2026-04-16 14:00:09.827396925 +0000 UTC m=+65.871430110" Apr 16 14:00:09.862876 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.862833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56b5bd7874-fvchd"] Apr 16 14:00:09.864594 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:09.864565 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc51c8a5_99bf_42f4_9371_6d41a4f0fc91.slice/crio-a67d908cd48c2f9cbe9617cde2d069fb02638887dcff237533bf28379becb227 WatchSource:0}: Error finding container 
a67d908cd48c2f9cbe9617cde2d069fb02638887dcff237533bf28379becb227: Status 404 returned error can't find the container with id a67d908cd48c2f9cbe9617cde2d069fb02638887dcff237533bf28379becb227 Apr 16 14:00:09.924855 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.924826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd"] Apr 16 14:00:09.927526 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:09.927504 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8ptvc"] Apr 16 14:00:09.929209 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:09.929183 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e08263_d0bc_469a_b7ae_b83c965fa7a3.slice/crio-258edd9c30fe1dbe17891d053072369d46f939be7353e6ce2e3e9d38fc056102 WatchSource:0}: Error finding container 258edd9c30fe1dbe17891d053072369d46f939be7353e6ce2e3e9d38fc056102: Status 404 returned error can't find the container with id 258edd9c30fe1dbe17891d053072369d46f939be7353e6ce2e3e9d38fc056102 Apr 16 14:00:09.933744 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:09.933718 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e37e261_0537_4203_b8fd_2b1189bee139.slice/crio-2935f229b4fbb5250fb730987c8fef72438f26270a11968e968c09d3073d4f46 WatchSource:0}: Error finding container 2935f229b4fbb5250fb730987c8fef72438f26270a11968e968c09d3073d4f46: Status 404 returned error can't find the container with id 2935f229b4fbb5250fb730987c8fef72438f26270a11968e968c09d3073d4f46 Apr 16 14:00:10.294513 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.294476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod 
\"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 14:00:10.297243 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.297224 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:10.307592 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.307567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45ca9ed1-9528-4529-8ffc-64027bd9e40a-metrics-certs\") pod \"network-metrics-daemon-59j5c\" (UID: \"45ca9ed1-9528-4529-8ffc-64027bd9e40a\") " pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 14:00:10.325967 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.325940 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-snjtl\"" Apr 16 14:00:10.334042 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.334026 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59j5c" Apr 16 14:00:10.452725 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.452695 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-59j5c"] Apr 16 14:00:10.455299 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:10.455263 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ca9ed1_9528_4529_8ffc_64027bd9e40a.slice/crio-3b761e5cbc01156118b2cde36d5a67a3822c1d3c11bf1bde4c662016544c631e WatchSource:0}: Error finding container 3b761e5cbc01156118b2cde36d5a67a3822c1d3c11bf1bde4c662016544c631e: Status 404 returned error can't find the container with id 3b761e5cbc01156118b2cde36d5a67a3822c1d3c11bf1bde4c662016544c631e Apr 16 14:00:10.815462 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.815409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8ptvc" event={"ID":"2e37e261-0537-4203-b8fd-2b1189bee139","Type":"ContainerStarted","Data":"2935f229b4fbb5250fb730987c8fef72438f26270a11968e968c09d3073d4f46"} Apr 16 14:00:10.817242 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.817165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-59j5c" event={"ID":"45ca9ed1-9528-4529-8ffc-64027bd9e40a","Type":"ContainerStarted","Data":"3b761e5cbc01156118b2cde36d5a67a3822c1d3c11bf1bde4c662016544c631e"} Apr 16 14:00:10.819344 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.819303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" event={"ID":"60e08263-d0bc-469a-b7ae-b83c965fa7a3","Type":"ContainerStarted","Data":"258edd9c30fe1dbe17891d053072369d46f939be7353e6ce2e3e9d38fc056102"} Apr 16 14:00:10.821704 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.821613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-56b5bd7874-fvchd" event={"ID":"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91","Type":"ContainerStarted","Data":"0217e84805724ebffdb6c260e195723779c157fade8944c570f00fd77bc80bdd"} Apr 16 14:00:10.821704 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.821653 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56b5bd7874-fvchd" event={"ID":"fc51c8a5-99bf-42f4-9371-6d41a4f0fc91","Type":"ContainerStarted","Data":"a67d908cd48c2f9cbe9617cde2d069fb02638887dcff237533bf28379becb227"} Apr 16 14:00:10.843150 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:10.843098 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-56b5bd7874-fvchd" podStartSLOduration=60.843080267 podStartE2EDuration="1m0.843080267s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:10.841410235 +0000 UTC m=+66.885443422" watchObservedRunningTime="2026-04-16 14:00:10.843080267 +0000 UTC m=+66.887113456" Apr 16 14:00:11.469747 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.469683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 14:00:11.473244 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.473046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 14:00:11.737703 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.737601 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-5ndkj"] Apr 16 14:00:11.741181 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.741127 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c7499dc6f-xpsq5"] Apr 16 14:00:11.741380 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:00:11.741296 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-5ndkj" Apr 16 14:00:11.744132 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.744112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bwhtn\"" Apr 16 14:00:11.744248 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.744231 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:00:11.744406 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.744329 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:11.744859 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.744829 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:00:11.748460 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.748424 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sb62k\"" Apr 16 14:00:11.749190 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.749140 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:00:11.749472 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.749446 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:00:11.749555 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.749535 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:00:11.749630 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.749561 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:00:11.751143 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.751126 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:00:11.757896 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.757835 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-5ndkj"] Apr 16 14:00:11.759596 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.759569 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7499dc6f-xpsq5"] Apr 16 14:00:11.824211 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.824179 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 14:00:11.825556 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.825520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-56b5bd7874-fvchd" Apr 16 14:00:11.843976 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.843934 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8mgk9"] Apr 16 14:00:11.848747 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.848723 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8mgk9" Apr 16 14:00:11.851485 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.851382 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:00:11.851485 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.851396 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h5f2n\"" Apr 16 14:00:11.851485 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.851386 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:00:11.860016 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.859994 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8mgk9"] Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.908897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-oauth-config\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.908993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8kl\" (UniqueName: \"kubernetes.io/projected/69eadc2d-600e-416d-90d1-59a54327ba85-kube-api-access-rn8kl\") pod \"downloads-586b57c7b4-5ndkj\" (UID: \"69eadc2d-600e-416d-90d1-59a54327ba85\") " pod="openshift-console/downloads-586b57c7b4-5ndkj" Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.909042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-lmkdl\" (UniqueName: \"kubernetes.io/projected/02e8748e-4690-45a0-9b95-5651f7f2d662-kube-api-access-lmkdl\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.909104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-console-config\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.909131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-oauth-serving-cert\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.909162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-serving-cert\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:11.909227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:11.909184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-service-ca\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 
14:00:12.010062 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.009980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmkdl\" (UniqueName: \"kubernetes.io/projected/02e8748e-4690-45a0-9b95-5651f7f2d662-kube-api-access-lmkdl\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:12.010062 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-console-config\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:12.010062 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-oauth-serving-cert\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-serving-cert\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-service-ca\") pod \"console-6c7499dc6f-xpsq5\" (UID: 
\"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-oauth-config\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8tb\" (UniqueName: \"kubernetes.io/projected/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-kube-api-access-5v8tb\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-data-volume\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9" Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010286 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rn8kl\" (UniqueName: \"kubernetes.io/projected/69eadc2d-600e-416d-90d1-59a54327ba85-kube-api-access-rn8kl\") pod \"downloads-586b57c7b4-5ndkj\" (UID: \"69eadc2d-600e-416d-90d1-59a54327ba85\") " pod="openshift-console/downloads-586b57c7b4-5ndkj"
Apr 16 14:00:12.010328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-crio-socket\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.010725 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.010343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.011113 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.011047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-console-config\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.011234 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.011198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-oauth-serving-cert\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.011289 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.011257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-service-ca\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.013778 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.013318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-serving-cert\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.013778 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.013732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-oauth-config\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.022167 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.022139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8kl\" (UniqueName: \"kubernetes.io/projected/69eadc2d-600e-416d-90d1-59a54327ba85-kube-api-access-rn8kl\") pod \"downloads-586b57c7b4-5ndkj\" (UID: \"69eadc2d-600e-416d-90d1-59a54327ba85\") " pod="openshift-console/downloads-586b57c7b4-5ndkj"
Apr 16 14:00:12.022418 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.022396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmkdl\" (UniqueName: \"kubernetes.io/projected/02e8748e-4690-45a0-9b95-5651f7f2d662-kube-api-access-lmkdl\") pod \"console-6c7499dc6f-xpsq5\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.057292 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.057227 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-5ndkj"
Apr 16 14:00:12.064150 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.064119 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:12.111562 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.111528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.111562 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.111568 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8tb\" (UniqueName: \"kubernetes.io/projected/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-kube-api-access-5v8tb\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.111780 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.111724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-data-volume\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.111843 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.111779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-crio-socket\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.111843 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.111809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.111951 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.111887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-crio-socket\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.112052 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.112034 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-data-volume\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.112493 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.112470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.113835 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.113803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.121227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.121202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8tb\" (UniqueName: \"kubernetes.io/projected/9e7cd354-7d8c-4218-9aa9-27bebdc77ec8-kube-api-access-5v8tb\") pod \"insights-runtime-extractor-8mgk9\" (UID: \"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8\") " pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:12.161457 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:12.161422 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8mgk9"
Apr 16 14:00:13.590710 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.590143 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-5ndkj"]
Apr 16 14:00:13.594414 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:13.594343 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69eadc2d_600e_416d_90d1_59a54327ba85.slice/crio-ef83add981f79012a340fb1dcc7877deee9439986eece38386805f8b92351ac9 WatchSource:0}: Error finding container ef83add981f79012a340fb1dcc7877deee9439986eece38386805f8b92351ac9: Status 404 returned error can't find the container with id ef83add981f79012a340fb1dcc7877deee9439986eece38386805f8b92351ac9
Apr 16 14:00:13.618431 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.615814 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7499dc6f-xpsq5"]
Apr 16 14:00:13.621524 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:13.621487 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e8748e_4690_45a0_9b95_5651f7f2d662.slice/crio-e948fc27fd5aa01ffca7ec10bd7b918d42ea5c06660b607aa64a160040d59fbe WatchSource:0}: Error finding container e948fc27fd5aa01ffca7ec10bd7b918d42ea5c06660b607aa64a160040d59fbe: Status 404 returned error can't find the container with id e948fc27fd5aa01ffca7ec10bd7b918d42ea5c06660b607aa64a160040d59fbe
Apr 16 14:00:13.642783 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.642725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8mgk9"]
Apr 16 14:00:13.649937 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:13.649912 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e7cd354_7d8c_4218_9aa9_27bebdc77ec8.slice/crio-acb32696224b498d0dc1f0996c3d7101b44b82ff8d2ffacc1209cd4c6909476c WatchSource:0}: Error finding container acb32696224b498d0dc1f0996c3d7101b44b82ff8d2ffacc1209cd4c6909476c: Status 404 returned error can't find the container with id acb32696224b498d0dc1f0996c3d7101b44b82ff8d2ffacc1209cd4c6909476c
Apr 16 14:00:13.831509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.831480 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kqzcz" event={"ID":"a8fbe668-dcf2-4f5d-97a8-f28c95ce8261","Type":"ContainerStarted","Data":"aab9ef210155726f6cf3e150835aa303656814155bc6b423cd434d04ea4d6b87"}
Apr 16 14:00:13.832814 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.832791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8ptvc" event={"ID":"2e37e261-0537-4203-b8fd-2b1189bee139","Type":"ContainerStarted","Data":"83c47c79174776fcae83166072509bae0415ff2e6171f232e7ae73d768328c6f"}
Apr 16 14:00:13.834025 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.833979 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-59j5c" event={"ID":"45ca9ed1-9528-4529-8ffc-64027bd9e40a","Type":"ContainerStarted","Data":"5a76dae81c60bd8bcb7f776162770812fd0dcc939a8a0974571c52d010b69b4c"}
Apr 16 14:00:13.835436 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.835413 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" event={"ID":"9f9e0f27-a5f3-44ee-9f24-aec06b0aa130","Type":"ContainerStarted","Data":"735be9d2ae3eb6925b13e2d39040ff887a09bb95d85e6acafcc1b37d99ca6cd2"}
Apr 16 14:00:13.836389 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.836370 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7499dc6f-xpsq5" event={"ID":"02e8748e-4690-45a0-9b95-5651f7f2d662","Type":"ContainerStarted","Data":"e948fc27fd5aa01ffca7ec10bd7b918d42ea5c06660b607aa64a160040d59fbe"}
Apr 16 14:00:13.838920 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.838898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-5ndkj" event={"ID":"69eadc2d-600e-416d-90d1-59a54327ba85","Type":"ContainerStarted","Data":"ef83add981f79012a340fb1dcc7877deee9439986eece38386805f8b92351ac9"}
Apr 16 14:00:13.841270 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.841155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" event={"ID":"60e08263-d0bc-469a-b7ae-b83c965fa7a3","Type":"ContainerStarted","Data":"e61f74a294d58a992167b6133295e0f9fad95b3856aaee960ec3ecc460071e94"}
Apr 16 14:00:13.843050 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.843029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mgk9" event={"ID":"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8","Type":"ContainerStarted","Data":"f55637bf71e638035d2ce1116b92319a512aa8299ef0a2d0843b301dd1c0057b"}
Apr 16 14:00:13.843254 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.843222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mgk9" event={"ID":"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8","Type":"ContainerStarted","Data":"acb32696224b498d0dc1f0996c3d7101b44b82ff8d2ffacc1209cd4c6909476c"}
Apr 16 14:00:13.852526 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.852489 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kqzcz" podStartSLOduration=33.190938059 podStartE2EDuration="36.852473883s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 14:00:09.726416859 +0000 UTC m=+65.770450027" lastFinishedPulling="2026-04-16 14:00:13.387952681 +0000 UTC m=+69.431985851" observedRunningTime="2026-04-16 14:00:13.851603756 +0000 UTC m=+69.895636949" watchObservedRunningTime="2026-04-16 14:00:13.852473883 +0000 UTC m=+69.896507107"
Apr 16 14:00:13.897535 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.897480 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6gtrd" podStartSLOduration=60.44424154 podStartE2EDuration="1m3.897459329s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 14:00:09.931307305 +0000 UTC m=+65.975340469" lastFinishedPulling="2026-04-16 14:00:13.384525077 +0000 UTC m=+69.428558258" observedRunningTime="2026-04-16 14:00:13.877785679 +0000 UTC m=+69.921818866" watchObservedRunningTime="2026-04-16 14:00:13.897459329 +0000 UTC m=+69.941492516"
Apr 16 14:00:13.898280 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:13.898238 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ngv2d" podStartSLOduration=60.157932866 podStartE2EDuration="1m3.898222288s" podCreationTimestamp="2026-04-16 13:59:10 +0000 UTC" firstStartedPulling="2026-04-16 14:00:09.644224043 +0000 UTC m=+65.688257210" lastFinishedPulling="2026-04-16 14:00:13.384513449 +0000 UTC m=+69.428546632" observedRunningTime="2026-04-16 14:00:13.897304523 +0000 UTC m=+69.941337709" watchObservedRunningTime="2026-04-16 14:00:13.898222288 +0000 UTC m=+69.942255476"
Apr 16 14:00:14.848412 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:14.848367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8ptvc" event={"ID":"2e37e261-0537-4203-b8fd-2b1189bee139","Type":"ContainerStarted","Data":"79b72fbfdce89f07665eb1bd1209a201be22b6ba1a687eaa1b7d70e3b0901765"}
Apr 16 14:00:14.848878 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:14.848516 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8ptvc"
Apr 16 14:00:14.850244 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:14.850210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-59j5c" event={"ID":"45ca9ed1-9528-4529-8ffc-64027bd9e40a","Type":"ContainerStarted","Data":"e78b13439a733df197d49283fdedc8de8a0ac20b8036795deffdfd4c4585ca0d"}
Apr 16 14:00:14.867483 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:14.867441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8ptvc" podStartSLOduration=34.418841999 podStartE2EDuration="37.867428345s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 14:00:09.935926129 +0000 UTC m=+65.979959294" lastFinishedPulling="2026-04-16 14:00:13.384512462 +0000 UTC m=+69.428545640" observedRunningTime="2026-04-16 14:00:14.864737919 +0000 UTC m=+70.908771104" watchObservedRunningTime="2026-04-16 14:00:14.867428345 +0000 UTC m=+70.911461530"
Apr 16 14:00:14.880886 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:14.880838 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-59j5c" podStartSLOduration=67.948345955 podStartE2EDuration="1m10.880824064s" podCreationTimestamp="2026-04-16 13:59:04 +0000 UTC" firstStartedPulling="2026-04-16 14:00:10.457156572 +0000 UTC m=+66.501189738" lastFinishedPulling="2026-04-16 14:00:13.389634669 +0000 UTC m=+69.433667847" observedRunningTime="2026-04-16 14:00:14.879553458 +0000 UTC m=+70.923586644" watchObservedRunningTime="2026-04-16 14:00:14.880824064 +0000 UTC m=+70.924857251"
Apr 16 14:00:15.855624 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:15.855585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mgk9" event={"ID":"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8","Type":"ContainerStarted","Data":"0b819d3b93f9b9a75bb7fae8c4bd43a17d3ca8770addea2df4960b592b959dbd"}
Apr 16 14:00:17.729609 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:17.729577 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cb8c5"
Apr 16 14:00:17.864253 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:17.864214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7499dc6f-xpsq5" event={"ID":"02e8748e-4690-45a0-9b95-5651f7f2d662","Type":"ContainerStarted","Data":"9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89"}
Apr 16 14:00:17.884929 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:17.884871 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c7499dc6f-xpsq5" podStartSLOduration=3.346211686 podStartE2EDuration="6.884852941s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.626559901 +0000 UTC m=+69.670593065" lastFinishedPulling="2026-04-16 14:00:17.165201154 +0000 UTC m=+73.209234320" observedRunningTime="2026-04-16 14:00:17.883164273 +0000 UTC m=+73.927197459" watchObservedRunningTime="2026-04-16 14:00:17.884852941 +0000 UTC m=+73.928886129"
Apr 16 14:00:19.873478 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:19.873432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mgk9" event={"ID":"9e7cd354-7d8c-4218-9aa9-27bebdc77ec8","Type":"ContainerStarted","Data":"19e1bbb2f1a8b0b953f2b5f86a22c4c5caf92fb51211f6e37dcb872c8c397385"}
Apr 16 14:00:19.891901 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:19.891845 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8mgk9" podStartSLOduration=3.35366526 podStartE2EDuration="8.891828865s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.753427356 +0000 UTC m=+69.797460534" lastFinishedPulling="2026-04-16 14:00:19.29159096 +0000 UTC m=+75.335624139" observedRunningTime="2026-04-16 14:00:19.89065587 +0000 UTC m=+75.934689057" watchObservedRunningTime="2026-04-16 14:00:19.891828865 +0000 UTC m=+75.935862065"
Apr 16 14:00:22.064532 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:22.064497 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:22.065025 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:22.064543 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c7499dc6f-xpsq5"
Apr 16 14:00:22.066197 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:22.066172 2573 patch_prober.go:28] interesting pod/console-6c7499dc6f-xpsq5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.22:8443/health\": dial tcp 10.132.0.22:8443: connect: connection refused" start-of-body=
Apr 16 14:00:22.066321 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:22.066223 2573 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6c7499dc6f-xpsq5" podUID="02e8748e-4690-45a0-9b95-5651f7f2d662" containerName="console" probeResult="failure" output="Get \"https://10.132.0.22:8443/health\": dial tcp 10.132.0.22:8443: connect: connection refused"
Apr 16 14:00:24.858464 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:24.858430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8ptvc"
Apr 16 14:00:26.475079 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.475036 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ns2xt"]
Apr 16 14:00:26.501350 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.501215 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.505208 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.505183 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:00:26.505362 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.505332 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:00:26.505629 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.505612 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:00:26.505734 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.505697 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:00:26.505798 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.505611 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nmjsm\""
Apr 16 14:00:26.548274 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-textfile\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-wtmp\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-tls\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmtf\" (UniqueName: \"kubernetes.io/projected/06256a7c-b1ff-449e-8280-66dc210fe78d-kube-api-access-vwmtf\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548440 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-root\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548702 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-sys\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548702 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548542 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548702 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06256a7c-b1ff-449e-8280-66dc210fe78d-metrics-client-ca\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.548702 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.548589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-accelerators-collector-config\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649773 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649731 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmtf\" (UniqueName: \"kubernetes.io/projected/06256a7c-b1ff-449e-8280-66dc210fe78d-kube-api-access-vwmtf\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-root\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-sys\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06256a7c-b1ff-449e-8280-66dc210fe78d-metrics-client-ca\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-accelerators-collector-config\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.649970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.649969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-textfile\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.650247 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.650015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-wtmp\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.650247 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.650042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-tls\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.650247 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:00:26.650185 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 14:00:26.650380 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:00:26.650250 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-tls podName:06256a7c-b1ff-449e-8280-66dc210fe78d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.150228912 +0000 UTC m=+83.194262082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-tls") pod "node-exporter-ns2xt" (UID: "06256a7c-b1ff-449e-8280-66dc210fe78d") : secret "node-exporter-tls" not found
Apr 16 14:00:26.650752 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.650723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-root\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.650880 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.650775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-sys\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.651437 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.651414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-textfile\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.651738 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.651710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-wtmp\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.652477 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.651902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06256a7c-b1ff-449e-8280-66dc210fe78d-metrics-client-ca\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.652477 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.652051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-accelerators-collector-config\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.654411 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.654386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:26.670101 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:26.670046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmtf\" (UniqueName: \"kubernetes.io/projected/06256a7c-b1ff-449e-8280-66dc210fe78d-kube-api-access-vwmtf\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:27.155348 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:27.155313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-tls\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:27.157966 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:27.157942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06256a7c-b1ff-449e-8280-66dc210fe78d-node-exporter-tls\") pod \"node-exporter-ns2xt\" (UID: \"06256a7c-b1ff-449e-8280-66dc210fe78d\") " pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:27.421824 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:27.421742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ns2xt"
Apr 16 14:00:29.399690 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:29.399542 2573 patch_prober.go:28] interesting pod/image-registry-9dc6b545d-grd2w container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:00:29.399690 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:29.399601 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" podUID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:00:30.826457 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:30.826429 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w"
Apr 16 14:00:31.255350 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:31.255259 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7499dc6f-xpsq5"]
Apr 16 14:00:31.381066 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:00:31.381032 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06256a7c_b1ff_449e_8280_66dc210fe78d.slice/crio-de6c70cf4a3bf8c6bed30dc2b6a36e5b1950e423512f8d3430db5450e6fc19b0 WatchSource:0}: Error finding container de6c70cf4a3bf8c6bed30dc2b6a36e5b1950e423512f8d3430db5450e6fc19b0: Status 404 returned error can't find the container with id de6c70cf4a3bf8c6bed30dc2b6a36e5b1950e423512f8d3430db5450e6fc19b0
Apr 16 14:00:31.913754 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:31.913712 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-5ndkj" event={"ID":"69eadc2d-600e-416d-90d1-59a54327ba85","Type":"ContainerStarted","Data":"7e36115fb67fa031e3af8e287d8e09db558b1fa972a0b49eda60d25ff3c6c49d"}
Apr 16 14:00:31.914431 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:31.914267 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-5ndkj"
Apr 16 14:00:31.915249 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:31.915225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns2xt" event={"ID":"06256a7c-b1ff-449e-8280-66dc210fe78d","Type":"ContainerStarted","Data":"de6c70cf4a3bf8c6bed30dc2b6a36e5b1950e423512f8d3430db5450e6fc19b0"}
Apr 16 14:00:31.930457 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:31.930410 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-5ndkj" podStartSLOduration=3.070870014 podStartE2EDuration="20.930394002s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.603965241 +0000 UTC m=+69.647998412" lastFinishedPulling="2026-04-16 14:00:31.463489221 +0000 UTC m=+87.507522400" observedRunningTime="2026-04-16 14:00:31.929145137 +0000 UTC m=+87.973178324" watchObservedRunningTime="2026-04-16 14:00:31.930394002 +0000 UTC m=+87.974427187"
Apr 16 14:00:31.931527 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:31.931504 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-5ndkj"
Apr 16 14:00:33.924825 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:33.924788 2573 generic.go:358] "Generic (PLEG): container finished" podID="06256a7c-b1ff-449e-8280-66dc210fe78d" containerID="302c90f36dc690b0febcc12d9389c55b696619ba16182cf364428db5157e9a50" exitCode=0
Apr 16 14:00:33.925293 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:33.924880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns2xt" event={"ID":"06256a7c-b1ff-449e-8280-66dc210fe78d","Type":"ContainerDied","Data":"302c90f36dc690b0febcc12d9389c55b696619ba16182cf364428db5157e9a50"}
Apr 16 14:00:34.574601 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:34.574562 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9dc6b545d-grd2w"]
Apr 16 14:00:34.930603 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:34.930511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns2xt" event={"ID":"06256a7c-b1ff-449e-8280-66dc210fe78d","Type":"ContainerStarted","Data":"f437aba56907ddf73fc17f76d63218bede3afadd6a563c48537053a3cfb6a7ae"}
Apr 16 14:00:34.930603 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:34.930559 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns2xt" event={"ID":"06256a7c-b1ff-449e-8280-66dc210fe78d","Type":"ContainerStarted","Data":"af772f01159ca85dbaa7b224e458f7cfd1891953208e757929d8f3144b822c30"}
Apr 16 14:00:34.948777 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:34.948722 2573
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ns2xt" podStartSLOduration=6.733910446 podStartE2EDuration="8.948706608s" podCreationTimestamp="2026-04-16 14:00:26 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.382760812 +0000 UTC m=+87.426793979" lastFinishedPulling="2026-04-16 14:00:33.597556974 +0000 UTC m=+89.641590141" observedRunningTime="2026-04-16 14:00:34.947734307 +0000 UTC m=+90.991767492" watchObservedRunningTime="2026-04-16 14:00:34.948706608 +0000 UTC m=+90.992739844" Apr 16 14:00:56.277549 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.277488 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c7499dc6f-xpsq5" podUID="02e8748e-4690-45a0-9b95-5651f7f2d662" containerName="console" containerID="cri-o://9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89" gracePeriod=15 Apr 16 14:00:56.517041 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.517018 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7499dc6f-xpsq5_02e8748e-4690-45a0-9b95-5651f7f2d662/console/0.log" Apr 16 14:00:56.517168 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.517075 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:56.617181 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617149 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-oauth-serving-cert\") pod \"02e8748e-4690-45a0-9b95-5651f7f2d662\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " Apr 16 14:00:56.617374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617195 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmkdl\" (UniqueName: \"kubernetes.io/projected/02e8748e-4690-45a0-9b95-5651f7f2d662-kube-api-access-lmkdl\") pod \"02e8748e-4690-45a0-9b95-5651f7f2d662\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " Apr 16 14:00:56.617374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617219 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-console-config\") pod \"02e8748e-4690-45a0-9b95-5651f7f2d662\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " Apr 16 14:00:56.617374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617239 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-service-ca\") pod \"02e8748e-4690-45a0-9b95-5651f7f2d662\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " Apr 16 14:00:56.617374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617273 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-serving-cert\") pod \"02e8748e-4690-45a0-9b95-5651f7f2d662\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " Apr 16 14:00:56.617374 
ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617347 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-oauth-config\") pod \"02e8748e-4690-45a0-9b95-5651f7f2d662\" (UID: \"02e8748e-4690-45a0-9b95-5651f7f2d662\") " Apr 16 14:00:56.617635 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617609 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-console-config" (OuterVolumeSpecName: "console-config") pod "02e8748e-4690-45a0-9b95-5651f7f2d662" (UID: "02e8748e-4690-45a0-9b95-5651f7f2d662"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:00:56.617788 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617752 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-service-ca" (OuterVolumeSpecName: "service-ca") pod "02e8748e-4690-45a0-9b95-5651f7f2d662" (UID: "02e8748e-4690-45a0-9b95-5651f7f2d662"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:00:56.617871 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.617803 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "02e8748e-4690-45a0-9b95-5651f7f2d662" (UID: "02e8748e-4690-45a0-9b95-5651f7f2d662"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:00:56.619529 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.619494 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "02e8748e-4690-45a0-9b95-5651f7f2d662" (UID: "02e8748e-4690-45a0-9b95-5651f7f2d662"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:00:56.619529 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.619523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "02e8748e-4690-45a0-9b95-5651f7f2d662" (UID: "02e8748e-4690-45a0-9b95-5651f7f2d662"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:00:56.619724 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.619578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e8748e-4690-45a0-9b95-5651f7f2d662-kube-api-access-lmkdl" (OuterVolumeSpecName: "kube-api-access-lmkdl") pod "02e8748e-4690-45a0-9b95-5651f7f2d662" (UID: "02e8748e-4690-45a0-9b95-5651f7f2d662"). InnerVolumeSpecName "kube-api-access-lmkdl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:00:56.718157 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.718122 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-oauth-config\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:00:56.718157 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.718152 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-oauth-serving-cert\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:00:56.718157 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.718165 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lmkdl\" (UniqueName: \"kubernetes.io/projected/02e8748e-4690-45a0-9b95-5651f7f2d662-kube-api-access-lmkdl\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:00:56.718394 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.718181 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-console-config\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:00:56.718394 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.718191 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02e8748e-4690-45a0-9b95-5651f7f2d662-service-ca\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:00:56.718394 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:56.718201 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02e8748e-4690-45a0-9b95-5651f7f2d662-console-serving-cert\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:00:57.000943 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:00:57.000867 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7499dc6f-xpsq5_02e8748e-4690-45a0-9b95-5651f7f2d662/console/0.log" Apr 16 14:00:57.000943 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.000915 2573 generic.go:358] "Generic (PLEG): container finished" podID="02e8748e-4690-45a0-9b95-5651f7f2d662" containerID="9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89" exitCode=2 Apr 16 14:00:57.001124 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.000971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7499dc6f-xpsq5" event={"ID":"02e8748e-4690-45a0-9b95-5651f7f2d662","Type":"ContainerDied","Data":"9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89"} Apr 16 14:00:57.001124 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.000987 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7499dc6f-xpsq5" Apr 16 14:00:57.001124 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.001007 2573 scope.go:117] "RemoveContainer" containerID="9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89" Apr 16 14:00:57.001124 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.000997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7499dc6f-xpsq5" event={"ID":"02e8748e-4690-45a0-9b95-5651f7f2d662","Type":"ContainerDied","Data":"e948fc27fd5aa01ffca7ec10bd7b918d42ea5c06660b607aa64a160040d59fbe"} Apr 16 14:00:57.009388 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.009370 2573 scope.go:117] "RemoveContainer" containerID="9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89" Apr 16 14:00:57.009636 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:00:57.009618 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89\": container with ID starting with 9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89 not found: ID does not exist" containerID="9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89" Apr 16 14:00:57.009736 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.009644 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89"} err="failed to get container status \"9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89\": rpc error: code = NotFound desc = could not find container \"9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89\": container with ID starting with 9943be65ee7c69e08b54c9767645da392c43280b1a22c9606c2555993e245b89 not found: ID does not exist" Apr 16 14:00:57.022452 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.022425 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7499dc6f-xpsq5"] Apr 16 14:00:57.024213 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:57.024192 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c7499dc6f-xpsq5"] Apr 16 14:00:58.516053 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:58.516019 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e8748e-4690-45a0-9b95-5651f7f2d662" path="/var/lib/kubelet/pods/02e8748e-4690-45a0-9b95-5651f7f2d662/volumes" Apr 16 14:00:59.601577 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:00:59.601518 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" podUID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" containerName="registry" containerID="cri-o://694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de" gracePeriod=30 Apr 16 14:00:59.862051 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:00:59.861992 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" Apr 16 14:01:00.011626 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.011590 2573 generic.go:358] "Generic (PLEG): container finished" podID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" containerID="694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de" exitCode=0 Apr 16 14:01:00.011826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.011652 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" Apr 16 14:01:00.011826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.011658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" event={"ID":"d7890347-e1d0-4b03-8b89-c727fa4c4f18","Type":"ContainerDied","Data":"694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de"} Apr 16 14:01:00.011826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.011716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9dc6b545d-grd2w" event={"ID":"d7890347-e1d0-4b03-8b89-c727fa4c4f18","Type":"ContainerDied","Data":"f44fb8a30a599d0fde5a497e003c1c7829be73342ca31ee78548280b71ef920b"} Apr 16 14:01:00.011826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.011736 2573 scope.go:117] "RemoveContainer" containerID="694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de" Apr 16 14:01:00.019088 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.019073 2573 scope.go:117] "RemoveContainer" containerID="694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de" Apr 16 14:01:00.019379 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:01:00.019361 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de\": container with ID starting with 694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de not found: ID does not exist" containerID="694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de" Apr 16 14:01:00.019435 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.019388 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de"} err="failed to get container status \"694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de\": rpc error: code = NotFound desc = could not find container \"694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de\": container with ID starting with 694778d8b445860e3eedd2c5204700840bad1f731752fab726d75aa0ba9ae9de not found: ID does not exist" Apr 16 14:01:00.045703 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045658 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-bound-sa-token\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.045788 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045723 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-trusted-ca\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.045788 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 
16 14:01:00.045788 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045760 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-image-registry-private-configuration\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.045788 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045788 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jxnd\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-kube-api-access-6jxnd\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.046046 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045806 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-certificates\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.046046 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.045835 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7890347-e1d0-4b03-8b89-c727fa4c4f18-ca-trust-extracted\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.046187 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.046156 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:00.046256 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.046193 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:00.046312 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.046291 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-installation-pull-secrets\") pod \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\" (UID: \"d7890347-e1d0-4b03-8b89-c727fa4c4f18\") " Apr 16 14:01:00.046581 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.046549 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-certificates\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.046660 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.046580 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7890347-e1d0-4b03-8b89-c727fa4c4f18-trusted-ca\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.048303 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.048222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:00.048303 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.048228 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:00.048466 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.048333 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:00.048519 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.048479 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-kube-api-access-6jxnd" (OuterVolumeSpecName: "kube-api-access-6jxnd") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "kube-api-access-6jxnd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:00.048768 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.048749 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:00.054562 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.054539 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7890347-e1d0-4b03-8b89-c727fa4c4f18-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d7890347-e1d0-4b03-8b89-c727fa4c4f18" (UID: "d7890347-e1d0-4b03-8b89-c727fa4c4f18"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:01:00.147311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.147211 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-bound-sa-token\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.147311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.147248 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-registry-tls\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.147311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.147262 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-image-registry-private-configuration\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.147311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.147274 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jxnd\" (UniqueName: \"kubernetes.io/projected/d7890347-e1d0-4b03-8b89-c727fa4c4f18-kube-api-access-6jxnd\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.147311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.147287 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/d7890347-e1d0-4b03-8b89-c727fa4c4f18-ca-trust-extracted\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.147311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.147300 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7890347-e1d0-4b03-8b89-c727fa4c4f18-installation-pull-secrets\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.338292 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.338255 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9dc6b545d-grd2w"] Apr 16 14:01:00.343181 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.343156 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-9dc6b545d-grd2w"] Apr 16 14:01:00.516164 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:00.516086 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" path="/var/lib/kubelet/pods/d7890347-e1d0-4b03-8b89-c727fa4c4f18/volumes" Apr 16 14:01:03.021913 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:03.021882 2573 generic.go:358] "Generic (PLEG): container finished" podID="027223a7-23fc-4750-bd60-a91d4fbb3300" containerID="0dcd71df7169eb27df415a60e5dd2a4a00a3389c4f5e8c0542c06138d787c75e" exitCode=0 Apr 16 14:01:03.022359 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:03.021932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" event={"ID":"027223a7-23fc-4750-bd60-a91d4fbb3300","Type":"ContainerDied","Data":"0dcd71df7169eb27df415a60e5dd2a4a00a3389c4f5e8c0542c06138d787c75e"} Apr 16 14:01:03.022359 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:03.022231 2573 scope.go:117] "RemoveContainer" 
containerID="0dcd71df7169eb27df415a60e5dd2a4a00a3389c4f5e8c0542c06138d787c75e" Apr 16 14:01:04.026740 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:04.026705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-845gn" event={"ID":"027223a7-23fc-4750-bd60-a91d4fbb3300","Type":"ContainerStarted","Data":"775215a6119e2f4d355bf2ec3467dcf4449ae41bac30b82ad184e5027327bc74"} Apr 16 14:01:09.040832 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:09.040796 2573 generic.go:358] "Generic (PLEG): container finished" podID="26ed3c5b-e875-42d2-b496-1c3aacfc5b95" containerID="19bc4f1445ba644ced630ecaf5477b4e892954da30ef764dcf53b56b25dc17e3" exitCode=0 Apr 16 14:01:09.041208 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:09.040874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" event={"ID":"26ed3c5b-e875-42d2-b496-1c3aacfc5b95","Type":"ContainerDied","Data":"19bc4f1445ba644ced630ecaf5477b4e892954da30ef764dcf53b56b25dc17e3"} Apr 16 14:01:09.041264 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:09.041246 2573 scope.go:117] "RemoveContainer" containerID="19bc4f1445ba644ced630ecaf5477b4e892954da30ef764dcf53b56b25dc17e3" Apr 16 14:01:10.044808 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:10.044771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-c2xv7" event={"ID":"26ed3c5b-e875-42d2-b496-1c3aacfc5b95","Type":"ContainerStarted","Data":"87cc54b116750880cd65a4234e4303bd52e7a8b298182bbfe69b93ce539bf0ec"} Apr 16 14:01:17.063883 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:17.063845 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f6b46fd-2e36-47ee-945a-f745461401d6" containerID="64bff32d5615684a6aa9f5b42cbe05409585f3584a7fc49bf42ecb871cba3f25" exitCode=0 Apr 16 14:01:17.064359 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:01:17.063918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" event={"ID":"8f6b46fd-2e36-47ee-945a-f745461401d6","Type":"ContainerDied","Data":"64bff32d5615684a6aa9f5b42cbe05409585f3584a7fc49bf42ecb871cba3f25"} Apr 16 14:01:17.064359 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:17.064271 2573 scope.go:117] "RemoveContainer" containerID="64bff32d5615684a6aa9f5b42cbe05409585f3584a7fc49bf42ecb871cba3f25" Apr 16 14:01:18.068537 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:01:18.068504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5rth9" event={"ID":"8f6b46fd-2e36-47ee-945a-f745461401d6","Type":"ContainerStarted","Data":"d25feaf0e25e8134c7b0af5cc9d482e6bad3163c57bab03899c872cace7da741"} Apr 16 14:03:53.287120 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.287079 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn"] Apr 16 14:03:53.287546 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.287394 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e8748e-4690-45a0-9b95-5651f7f2d662" containerName="console" Apr 16 14:03:53.287546 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.287406 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e8748e-4690-45a0-9b95-5651f7f2d662" containerName="console" Apr 16 14:03:53.287546 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.287417 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" containerName="registry" Apr 16 14:03:53.287546 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.287423 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" containerName="registry" Apr 16 14:03:53.287546 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:03:53.287478 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7890347-e1d0-4b03-8b89-c727fa4c4f18" containerName="registry" Apr 16 14:03:53.287546 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.287490 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="02e8748e-4690-45a0-9b95-5651f7f2d662" containerName="console" Apr 16 14:03:53.290411 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.290395 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.293073 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.293050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:03:53.294328 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.294310 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-xsdxt\"" Apr 16 14:03:53.294437 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.294362 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:03:53.297477 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.297430 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn"] Apr 16 14:03:53.359820 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.359782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwfpc\" (UniqueName: \"kubernetes.io/projected/4bc4b247-d702-48ef-8979-21556ed10ba7-kube-api-access-zwfpc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.359998 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.359840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.359998 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.359899 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.460970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.460939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.461161 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.461023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.461161 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.461059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwfpc\" (UniqueName: \"kubernetes.io/projected/4bc4b247-d702-48ef-8979-21556ed10ba7-kube-api-access-zwfpc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.461370 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.461349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.461438 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.461384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.471032 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.470996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwfpc\" (UniqueName: \"kubernetes.io/projected/4bc4b247-d702-48ef-8979-21556ed10ba7-kube-api-access-zwfpc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.599935 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.599904 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:03:53.723098 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:53.723065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn"] Apr 16 14:03:53.725573 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:03:53.725547 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc4b247_d702_48ef_8979_21556ed10ba7.slice/crio-7454dcf6697353438745a906fd33f03d24d334a9b7b056a344d94f0719c28954 WatchSource:0}: Error finding container 7454dcf6697353438745a906fd33f03d24d334a9b7b056a344d94f0719c28954: Status 404 returned error can't find the container with id 7454dcf6697353438745a906fd33f03d24d334a9b7b056a344d94f0719c28954 Apr 16 14:03:54.520480 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:54.520442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" event={"ID":"4bc4b247-d702-48ef-8979-21556ed10ba7","Type":"ContainerStarted","Data":"7454dcf6697353438745a906fd33f03d24d334a9b7b056a344d94f0719c28954"} Apr 16 14:03:59.531504 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:59.531465 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerID="1a7e827ef6c47c87da8d3c8f88e6ba57855bab6bcd20294a8f5801ab770b0aa4" exitCode=0 Apr 16 14:03:59.531945 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:03:59.531521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" event={"ID":"4bc4b247-d702-48ef-8979-21556ed10ba7","Type":"ContainerDied","Data":"1a7e827ef6c47c87da8d3c8f88e6ba57855bab6bcd20294a8f5801ab770b0aa4"} Apr 16 14:04:02.543689 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:02.543633 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerID="04b0b46e55b4b028acf52d9e68eafa52ef16c32cf06d9f32764ca4599248ccec" exitCode=0 Apr 16 14:04:02.544063 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:02.543708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" event={"ID":"4bc4b247-d702-48ef-8979-21556ed10ba7","Type":"ContainerDied","Data":"04b0b46e55b4b028acf52d9e68eafa52ef16c32cf06d9f32764ca4599248ccec"} Apr 16 14:04:04.430531 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:04.430498 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:04:04.431163 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:04.431137 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:04:04.435176 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:04.435153 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:04:04.435403 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:04.435384 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:04:04.443427 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:04:04.443397 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:04:09.564595 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:09.564554 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerID="fa314ec575fa8402e27aea2f06a0041fe3a18ba95b1ee2358cb6c03c0ac0c217" exitCode=0 Apr 16 14:04:09.564595 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:09.564600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" event={"ID":"4bc4b247-d702-48ef-8979-21556ed10ba7","Type":"ContainerDied","Data":"fa314ec575fa8402e27aea2f06a0041fe3a18ba95b1ee2358cb6c03c0ac0c217"} Apr 16 14:04:10.692642 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.692620 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:04:10.798715 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.798651 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwfpc\" (UniqueName: \"kubernetes.io/projected/4bc4b247-d702-48ef-8979-21556ed10ba7-kube-api-access-zwfpc\") pod \"4bc4b247-d702-48ef-8979-21556ed10ba7\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " Apr 16 14:04:10.798931 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.798742 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-util\") pod \"4bc4b247-d702-48ef-8979-21556ed10ba7\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " Apr 16 14:04:10.798931 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.798798 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-bundle\") pod 
\"4bc4b247-d702-48ef-8979-21556ed10ba7\" (UID: \"4bc4b247-d702-48ef-8979-21556ed10ba7\") " Apr 16 14:04:10.799327 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.799292 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-bundle" (OuterVolumeSpecName: "bundle") pod "4bc4b247-d702-48ef-8979-21556ed10ba7" (UID: "4bc4b247-d702-48ef-8979-21556ed10ba7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:04:10.801026 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.800999 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc4b247-d702-48ef-8979-21556ed10ba7-kube-api-access-zwfpc" (OuterVolumeSpecName: "kube-api-access-zwfpc") pod "4bc4b247-d702-48ef-8979-21556ed10ba7" (UID: "4bc4b247-d702-48ef-8979-21556ed10ba7"). InnerVolumeSpecName "kube-api-access-zwfpc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:04:10.803136 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.803114 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-util" (OuterVolumeSpecName: "util") pod "4bc4b247-d702-48ef-8979-21556ed10ba7" (UID: "4bc4b247-d702-48ef-8979-21556ed10ba7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:04:10.899363 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.899273 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-bundle\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:04:10.899363 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.899306 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwfpc\" (UniqueName: \"kubernetes.io/projected/4bc4b247-d702-48ef-8979-21556ed10ba7-kube-api-access-zwfpc\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:04:10.899363 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:10.899316 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bc4b247-d702-48ef-8979-21556ed10ba7-util\") on node \"ip-10-0-130-195.ec2.internal\" DevicePath \"\"" Apr 16 14:04:11.572189 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:11.572156 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" Apr 16 14:04:11.572354 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:11.572157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29crx5fn" event={"ID":"4bc4b247-d702-48ef-8979-21556ed10ba7","Type":"ContainerDied","Data":"7454dcf6697353438745a906fd33f03d24d334a9b7b056a344d94f0719c28954"} Apr 16 14:04:11.572354 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:11.572271 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7454dcf6697353438745a906fd33f03d24d334a9b7b056a344d94f0719c28954" Apr 16 14:04:14.930197 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930158 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d"] Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930442 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="pull" Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930453 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="pull" Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930460 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="extract" Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930465 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="extract" Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930500 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="util" Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930506 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="util" Apr 16 14:04:14.930650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.930554 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bc4b247-d702-48ef-8979-21556ed10ba7" containerName="extract" Apr 16 14:04:14.934538 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.934521 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:14.937365 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.937343 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 14:04:14.937505 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.937448 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-t9zkz\"" Apr 16 14:04:14.937566 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.937545 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 14:04:14.937646 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.937633 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 14:04:14.945392 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:14.945369 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d"] Apr 16 14:04:15.033839 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.033804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4g7\" (UniqueName: 
\"kubernetes.io/projected/a77f657e-5f0c-4111-950a-4a1b3b4762a6-kube-api-access-vq4g7\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jb82d\" (UID: \"a77f657e-5f0c-4111-950a-4a1b3b4762a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.034020 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.033882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a77f657e-5f0c-4111-950a-4a1b3b4762a6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jb82d\" (UID: \"a77f657e-5f0c-4111-950a-4a1b3b4762a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.134990 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.134964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a77f657e-5f0c-4111-950a-4a1b3b4762a6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jb82d\" (UID: \"a77f657e-5f0c-4111-950a-4a1b3b4762a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.135131 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.135000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq4g7\" (UniqueName: \"kubernetes.io/projected/a77f657e-5f0c-4111-950a-4a1b3b4762a6-kube-api-access-vq4g7\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jb82d\" (UID: \"a77f657e-5f0c-4111-950a-4a1b3b4762a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.137278 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.137248 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a77f657e-5f0c-4111-950a-4a1b3b4762a6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jb82d\" (UID: 
\"a77f657e-5f0c-4111-950a-4a1b3b4762a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.143774 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.143751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq4g7\" (UniqueName: \"kubernetes.io/projected/a77f657e-5f0c-4111-950a-4a1b3b4762a6-kube-api-access-vq4g7\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jb82d\" (UID: \"a77f657e-5f0c-4111-950a-4a1b3b4762a6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.244963 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.244864 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:15.368632 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.368600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d"] Apr 16 14:04:15.372272 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:04:15.372241 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda77f657e_5f0c_4111_950a_4a1b3b4762a6.slice/crio-feb9e03e3aa99294a392f4e090dccd76b9cada607954ec6347c32396216e3c67 WatchSource:0}: Error finding container feb9e03e3aa99294a392f4e090dccd76b9cada607954ec6347c32396216e3c67: Status 404 returned error can't find the container with id feb9e03e3aa99294a392f4e090dccd76b9cada607954ec6347c32396216e3c67 Apr 16 14:04:15.374311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.374291 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:04:15.584745 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:15.584708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" 
event={"ID":"a77f657e-5f0c-4111-950a-4a1b3b4762a6","Type":"ContainerStarted","Data":"feb9e03e3aa99294a392f4e090dccd76b9cada607954ec6347c32396216e3c67"} Apr 16 14:04:19.257301 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.257265 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lh6nm"] Apr 16 14:04:19.260739 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.260714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.265000 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:19.264967 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"keda-operator-dockercfg-qvlgf\" is forbidden: User \"system:node:ip-10-0-130-195.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-keda\": no relationship found between node 'ip-10-0-130-195.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-qvlgf\"" type="*v1.Secret" Apr 16 14:04:19.265140 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:19.265062 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"keda-ocp-cabundle\" is forbidden: User \"system:node:ip-10-0-130-195.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-keda\": no relationship found between node 'ip-10-0-130-195.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" type="*v1.ConfigMap" Apr 16 14:04:19.265324 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.265303 2573 status_manager.go:895] "Failed to get status for pod" podUID="24327a8f-498a-4c99-990b-9970be56c7ba" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" err="pods \"keda-operator-ffbb595cb-lh6nm\" is forbidden: User \"system:node:ip-10-0-130-195.ec2.internal\" cannot get resource 
\"pods\" in API group \"\" in the namespace \"openshift-keda\": no relationship found between node 'ip-10-0-130-195.ec2.internal' and this object" Apr 16 14:04:19.265488 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:19.265465 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"keda-operator-certs\" is forbidden: User \"system:node:ip-10-0-130-195.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-keda\": no relationship found between node 'ip-10-0-130-195.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" type="*v1.Secret" Apr 16 14:04:19.278271 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.278248 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lh6nm"] Apr 16 14:04:19.371905 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.371867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/24327a8f-498a-4c99-990b-9970be56c7ba-cabundle0\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.372104 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.371920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpd8\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-kube-api-access-bvpd8\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.372104 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.372025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.473236 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.473198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/24327a8f-498a-4c99-990b-9970be56c7ba-cabundle0\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.473439 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.473257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpd8\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-kube-api-access-bvpd8\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.473439 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.473295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.481313 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.481286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpd8\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-kube-api-access-bvpd8\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:19.600715 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.600659 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" event={"ID":"a77f657e-5f0c-4111-950a-4a1b3b4762a6","Type":"ContainerStarted","Data":"a0ccbb3f90ac7a922aef6ecbbdff8ad31dbe1a4eb199b45bb91f6d61f8977c9c"} Apr 16 14:04:19.600963 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.600938 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:19.600963 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.600963 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj"] Apr 16 14:04:19.604368 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.604345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.606632 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.606612 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 14:04:19.611939 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.611916 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj"] Apr 16 14:04:19.654809 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.654762 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" podStartSLOduration=2.2756995780000002 podStartE2EDuration="5.654747216s" podCreationTimestamp="2026-04-16 14:04:14 +0000 UTC" firstStartedPulling="2026-04-16 14:04:15.374476284 +0000 UTC m=+311.418509462" lastFinishedPulling="2026-04-16 14:04:18.75352392 +0000 UTC m=+314.797557100" observedRunningTime="2026-04-16 14:04:19.636978359 +0000 UTC m=+315.681011536" watchObservedRunningTime="2026-04-16 14:04:19.654747216 +0000 UTC m=+315.698780401" Apr 16 
14:04:19.674967 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.674926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e27ea9c5-3675-4a92-a609-399c4f07260a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.675154 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.675113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9mm\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-kube-api-access-5f9mm\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.675234 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.675176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.775615 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.775575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9mm\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-kube-api-access-5f9mm\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.775851 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.775626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.775851 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.775649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e27ea9c5-3675-4a92-a609-399c4f07260a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.775851 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:19.775759 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:04:19.775851 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:19.775778 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 14:04:19.775851 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:19.775793 2573 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 14:04:19.776062 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.776011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e27ea9c5-3675-4a92-a609-399c4f07260a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.784321 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.784290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9mm\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-kube-api-access-5f9mm\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: 
\"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:19.981735 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.981636 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-t8xln"] Apr 16 14:04:19.985002 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.984984 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:19.987510 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.987490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 14:04:19.992323 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:19.992223 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t8xln"] Apr 16 14:04:20.078948 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.078915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cec6f6b2-5658-413e-8b30-f2b16ef5f655-certificates\") pod \"keda-admission-cf49989db-t8xln\" (UID: \"cec6f6b2-5658-413e-8b30-f2b16ef5f655\") " pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.079139 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.078996 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66xw\" (UniqueName: \"kubernetes.io/projected/cec6f6b2-5658-413e-8b30-f2b16ef5f655-kube-api-access-f66xw\") pod \"keda-admission-cf49989db-t8xln\" (UID: \"cec6f6b2-5658-413e-8b30-f2b16ef5f655\") " pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.171917 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.171888 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 
14:04:20.173997 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.173978 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 14:04:20.174055 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.174002 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 14:04:20.174055 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.174019 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lh6nm: references non-existent secret key: ca.crt Apr 16 14:04:20.174126 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.174092 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates podName:24327a8f-498a-4c99-990b-9970be56c7ba nodeName:}" failed. No retries permitted until 2026-04-16 14:04:20.674068807 +0000 UTC m=+316.718102005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates") pod "keda-operator-ffbb595cb-lh6nm" (UID: "24327a8f-498a-4c99-990b-9970be56c7ba") : references non-existent secret key: ca.crt Apr 16 14:04:20.180070 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.180043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66xw\" (UniqueName: \"kubernetes.io/projected/cec6f6b2-5658-413e-8b30-f2b16ef5f655-kube-api-access-f66xw\") pod \"keda-admission-cf49989db-t8xln\" (UID: \"cec6f6b2-5658-413e-8b30-f2b16ef5f655\") " pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.180156 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.180143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cec6f6b2-5658-413e-8b30-f2b16ef5f655-certificates\") pod 
\"keda-admission-cf49989db-t8xln\" (UID: \"cec6f6b2-5658-413e-8b30-f2b16ef5f655\") " pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.201533 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.201504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f66xw\" (UniqueName: \"kubernetes.io/projected/cec6f6b2-5658-413e-8b30-f2b16ef5f655-kube-api-access-f66xw\") pod \"keda-admission-cf49989db-t8xln\" (UID: \"cec6f6b2-5658-413e-8b30-f2b16ef5f655\") " pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.473985 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.473938 2573 configmap.go:193] Couldn't get configMap openshift-keda/keda-ocp-cabundle: failed to sync configmap cache: timed out waiting for the condition Apr 16 14:04:20.474370 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.474032 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24327a8f-498a-4c99-990b-9970be56c7ba-cabundle0 podName:24327a8f-498a-4c99-990b-9970be56c7ba nodeName:}" failed. No retries permitted until 2026-04-16 14:04:20.974012704 +0000 UTC m=+317.018045879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cabundle0" (UniqueName: "kubernetes.io/configmap/24327a8f-498a-4c99-990b-9970be56c7ba-cabundle0") pod "keda-operator-ffbb595cb-lh6nm" (UID: "24327a8f-498a-4c99-990b-9970be56c7ba") : failed to sync configmap cache: timed out waiting for the condition Apr 16 14:04:20.541099 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.541066 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 14:04:20.546278 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.546256 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 14:04:20.546403 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.546333 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates podName:e27ea9c5-3675-4a92-a609-399c4f07260a nodeName:}" failed. No retries permitted until 2026-04-16 14:04:21.046312129 +0000 UTC m=+317.090345298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates") pod "keda-metrics-apiserver-7c9f485588-7g7wj" (UID: "e27ea9c5-3675-4a92-a609-399c4f07260a") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 14:04:20.552979 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.552954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cec6f6b2-5658-413e-8b30-f2b16ef5f655-certificates\") pod \"keda-admission-cf49989db-t8xln\" (UID: \"cec6f6b2-5658-413e-8b30-f2b16ef5f655\") " pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.684607 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.684550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:20.684835 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.684647 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 14:04:20.684835 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.684689 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 14:04:20.684835 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.684699 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lh6nm: references non-existent secret key: ca.crt Apr 16 14:04:20.684835 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:20.684751 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates 
podName:24327a8f-498a-4c99-990b-9970be56c7ba nodeName:}" failed. No retries permitted until 2026-04-16 14:04:21.684734802 +0000 UTC m=+317.728767977 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates") pod "keda-operator-ffbb595cb-lh6nm" (UID: "24327a8f-498a-4c99-990b-9970be56c7ba") : references non-existent secret key: ca.crt Apr 16 14:04:20.800436 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.800409 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-qvlgf\"" Apr 16 14:04:20.806491 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.806472 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:20.935092 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.935057 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t8xln"] Apr 16 14:04:20.938901 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:04:20.938872 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec6f6b2_5658_413e_8b30_f2b16ef5f655.slice/crio-fcb0e2b513bcc1a1fe156883084c3b26bcea23c4ecd1363e621da6b3fe4802d3 WatchSource:0}: Error finding container fcb0e2b513bcc1a1fe156883084c3b26bcea23c4ecd1363e621da6b3fe4802d3: Status 404 returned error can't find the container with id fcb0e2b513bcc1a1fe156883084c3b26bcea23c4ecd1363e621da6b3fe4802d3 Apr 16 14:04:20.987174 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.987135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/24327a8f-498a-4c99-990b-9970be56c7ba-cabundle0\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " 
pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:20.987782 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:20.987763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/24327a8f-498a-4c99-990b-9970be56c7ba-cabundle0\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:21.087589 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:21.087515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:21.087745 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.087713 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:04:21.087745 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.087738 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 14:04:21.087817 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.087761 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj: references non-existent secret key: tls.crt Apr 16 14:04:21.087851 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.087830 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates podName:e27ea9c5-3675-4a92-a609-399c4f07260a nodeName:}" failed. No retries permitted until 2026-04-16 14:04:22.087811434 +0000 UTC m=+318.131844602 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates") pod "keda-metrics-apiserver-7c9f485588-7g7wj" (UID: "e27ea9c5-3675-4a92-a609-399c4f07260a") : references non-existent secret key: tls.crt Apr 16 14:04:21.608318 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:21.608280 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t8xln" event={"ID":"cec6f6b2-5658-413e-8b30-f2b16ef5f655","Type":"ContainerStarted","Data":"fcb0e2b513bcc1a1fe156883084c3b26bcea23c4ecd1363e621da6b3fe4802d3"} Apr 16 14:04:21.693461 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:21.693423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:21.693651 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.693576 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 14:04:21.693651 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.693596 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 14:04:21.693651 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.693608 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lh6nm: references non-existent secret key: ca.crt Apr 16 14:04:21.693845 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:04:21.693687 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates podName:24327a8f-498a-4c99-990b-9970be56c7ba nodeName:}" failed. 
No retries permitted until 2026-04-16 14:04:23.693659895 +0000 UTC m=+319.737693064 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates") pod "keda-operator-ffbb595cb-lh6nm" (UID: "24327a8f-498a-4c99-990b-9970be56c7ba") : references non-existent secret key: ca.crt Apr 16 14:04:22.096581 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.096534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:22.099578 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.099551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e27ea9c5-3675-4a92-a609-399c4f07260a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7g7wj\" (UID: \"e27ea9c5-3675-4a92-a609-399c4f07260a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:22.316451 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.316415 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:22.437951 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.437924 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj"] Apr 16 14:04:22.440067 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:04:22.440030 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode27ea9c5_3675_4a92_a609_399c4f07260a.slice/crio-6b5aba9b61f787b0b12cf4ac07582e6bea1f441bffe9f5750e2f9cd6d8cc1b79 WatchSource:0}: Error finding container 6b5aba9b61f787b0b12cf4ac07582e6bea1f441bffe9f5750e2f9cd6d8cc1b79: Status 404 returned error can't find the container with id 6b5aba9b61f787b0b12cf4ac07582e6bea1f441bffe9f5750e2f9cd6d8cc1b79 Apr 16 14:04:22.612953 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.612844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t8xln" event={"ID":"cec6f6b2-5658-413e-8b30-f2b16ef5f655","Type":"ContainerStarted","Data":"4f819285e89927b640ae76b54ce74f684e188e172838b76431cd6ea958649e6c"} Apr 16 14:04:22.612953 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.612892 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:22.613860 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.613837 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" event={"ID":"e27ea9c5-3675-4a92-a609-399c4f07260a","Type":"ContainerStarted","Data":"6b5aba9b61f787b0b12cf4ac07582e6bea1f441bffe9f5750e2f9cd6d8cc1b79"} Apr 16 14:04:22.628227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:22.628170 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-t8xln" podStartSLOduration=2.355610408 
podStartE2EDuration="3.628150774s" podCreationTimestamp="2026-04-16 14:04:19 +0000 UTC" firstStartedPulling="2026-04-16 14:04:20.940604388 +0000 UTC m=+316.984637566" lastFinishedPulling="2026-04-16 14:04:22.213144767 +0000 UTC m=+318.257177932" observedRunningTime="2026-04-16 14:04:22.627538087 +0000 UTC m=+318.671571273" watchObservedRunningTime="2026-04-16 14:04:22.628150774 +0000 UTC m=+318.672183963" Apr 16 14:04:23.709489 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:23.709433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:23.711921 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:23.711899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/24327a8f-498a-4c99-990b-9970be56c7ba-certificates\") pod \"keda-operator-ffbb595cb-lh6nm\" (UID: \"24327a8f-498a-4c99-990b-9970be56c7ba\") " pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:23.771008 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:23.770966 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:23.891288 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:23.891253 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lh6nm"] Apr 16 14:04:23.894305 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:04:23.894275 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24327a8f_498a_4c99_990b_9970be56c7ba.slice/crio-50265bc468ffc5956c4be786817f639d98498bdc2f62cae5cee8b5f80ae88890 WatchSource:0}: Error finding container 50265bc468ffc5956c4be786817f639d98498bdc2f62cae5cee8b5f80ae88890: Status 404 returned error can't find the container with id 50265bc468ffc5956c4be786817f639d98498bdc2f62cae5cee8b5f80ae88890 Apr 16 14:04:24.621523 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:24.621484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" event={"ID":"24327a8f-498a-4c99-990b-9970be56c7ba","Type":"ContainerStarted","Data":"50265bc468ffc5956c4be786817f639d98498bdc2f62cae5cee8b5f80ae88890"} Apr 16 14:04:28.635720 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:28.635658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" event={"ID":"e27ea9c5-3675-4a92-a609-399c4f07260a","Type":"ContainerStarted","Data":"ef173ea785ddd711e77a05adf13e4fcf476835bc8776892d949195310fc649ba"} Apr 16 14:04:28.636169 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:28.635818 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:28.652873 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:28.652798 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" podStartSLOduration=3.578466727 
podStartE2EDuration="9.652780394s" podCreationTimestamp="2026-04-16 14:04:19 +0000 UTC" firstStartedPulling="2026-04-16 14:04:22.441410109 +0000 UTC m=+318.485443272" lastFinishedPulling="2026-04-16 14:04:28.515723775 +0000 UTC m=+324.559756939" observedRunningTime="2026-04-16 14:04:28.650756816 +0000 UTC m=+324.694790004" watchObservedRunningTime="2026-04-16 14:04:28.652780394 +0000 UTC m=+324.696813571" Apr 16 14:04:30.644014 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:30.643977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" event={"ID":"24327a8f-498a-4c99-990b-9970be56c7ba","Type":"ContainerStarted","Data":"8c49cbcf7efafc15462712c32a625b0e2e60ef58a6f97578b559596bb400f4c5"} Apr 16 14:04:30.644486 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:30.644130 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:04:30.661796 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:30.661741 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" podStartSLOduration=5.774350092 podStartE2EDuration="11.661722576s" podCreationTimestamp="2026-04-16 14:04:19 +0000 UTC" firstStartedPulling="2026-04-16 14:04:23.895721925 +0000 UTC m=+319.939755093" lastFinishedPulling="2026-04-16 14:04:29.783094407 +0000 UTC m=+325.827127577" observedRunningTime="2026-04-16 14:04:30.660947756 +0000 UTC m=+326.704980954" watchObservedRunningTime="2026-04-16 14:04:30.661722576 +0000 UTC m=+326.705755763" Apr 16 14:04:39.645814 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:39.645785 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7g7wj" Apr 16 14:04:40.606407 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:40.606375 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jb82d" Apr 16 14:04:43.619509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:43.619471 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-t8xln" Apr 16 14:04:51.649363 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:04:51.649291 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lh6nm" Apr 16 14:05:26.065048 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.065012 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"] Apr 16 14:05:26.068321 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.068303 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4" Apr 16 14:05:26.071061 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.071036 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:05:26.071949 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.071932 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 14:05:26.071949 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.071945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-gptk4\"" Apr 16 14:05:26.072088 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.071946 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:05:26.077094 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.077075 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"] Apr 16 14:05:26.080864 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.080840 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ade766-cee1-48d9-b302-dfc852499e70-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qhjp4\" (UID: \"50ade766-cee1-48d9-b302-dfc852499e70\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.080961 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.080879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptk6f\" (UniqueName: \"kubernetes.io/projected/50ade766-cee1-48d9-b302-dfc852499e70-kube-api-access-ptk6f\") pod \"llmisvc-controller-manager-68cc5db7c4-qhjp4\" (UID: \"50ade766-cee1-48d9-b302-dfc852499e70\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.182033 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.181997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ade766-cee1-48d9-b302-dfc852499e70-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qhjp4\" (UID: \"50ade766-cee1-48d9-b302-dfc852499e70\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.182033 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.182051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptk6f\" (UniqueName: \"kubernetes.io/projected/50ade766-cee1-48d9-b302-dfc852499e70-kube-api-access-ptk6f\") pod \"llmisvc-controller-manager-68cc5db7c4-qhjp4\" (UID: \"50ade766-cee1-48d9-b302-dfc852499e70\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.184475 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.184453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50ade766-cee1-48d9-b302-dfc852499e70-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qhjp4\" (UID: \"50ade766-cee1-48d9-b302-dfc852499e70\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.197223 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.197191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptk6f\" (UniqueName: \"kubernetes.io/projected/50ade766-cee1-48d9-b302-dfc852499e70-kube-api-access-ptk6f\") pod \"llmisvc-controller-manager-68cc5db7c4-qhjp4\" (UID: \"50ade766-cee1-48d9-b302-dfc852499e70\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.380895 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.380800 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:26.502190 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.502163 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"]
Apr 16 14:05:26.504489 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:05:26.504461 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod50ade766_cee1_48d9_b302_dfc852499e70.slice/crio-edf8476b53cc44b602a09dc7e6f1068f7a617ba248a083b1a205a647214232b4 WatchSource:0}: Error finding container edf8476b53cc44b602a09dc7e6f1068f7a617ba248a083b1a205a647214232b4: Status 404 returned error can't find the container with id edf8476b53cc44b602a09dc7e6f1068f7a617ba248a083b1a205a647214232b4
Apr 16 14:05:26.832729 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:26.832696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4" event={"ID":"50ade766-cee1-48d9-b302-dfc852499e70","Type":"ContainerStarted","Data":"edf8476b53cc44b602a09dc7e6f1068f7a617ba248a083b1a205a647214232b4"}
Apr 16 14:05:29.847516 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:29.847475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4" event={"ID":"50ade766-cee1-48d9-b302-dfc852499e70","Type":"ContainerStarted","Data":"964944054d63634852a00987bd03a4ebf570932db0842e0538481851bb98decc"}
Apr 16 14:05:29.847920 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:29.847546 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:05:29.862923 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:05:29.862869 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4" podStartSLOduration=1.675109855 podStartE2EDuration="3.862850704s" podCreationTimestamp="2026-04-16 14:05:26 +0000 UTC" firstStartedPulling="2026-04-16 14:05:26.50582217 +0000 UTC m=+382.549855334" lastFinishedPulling="2026-04-16 14:05:28.693563016 +0000 UTC m=+384.737596183" observedRunningTime="2026-04-16 14:05:29.861897446 +0000 UTC m=+385.905930632" watchObservedRunningTime="2026-04-16 14:05:29.862850704 +0000 UTC m=+385.906883892"
Apr 16 14:06:00.854293 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:00.854262 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qhjp4"
Apr 16 14:06:36.037081 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.037009 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-w59zs"]
Apr 16 14:06:36.039137 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.039120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.042103 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.042081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 14:06:36.042287 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.042268 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-bnd57\""
Apr 16 14:06:36.049951 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.049921 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-w59zs"]
Apr 16 14:06:36.056101 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.056077 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-b8k5j"]
Apr 16 14:06:36.058744 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.058727 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.063343 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.063326 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-4mdm2\""
Apr 16 14:06:36.063446 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.063333 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 16 14:06:36.067987 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.067966 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-b8k5j"]
Apr 16 14:06:36.153187 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.153150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b4eb6-0704-4195-9dd1-fb662aa05b1e-cert\") pod \"odh-model-controller-696fc77849-b8k5j\" (UID: \"f18b4eb6-0704-4195-9dd1-fb662aa05b1e\") " pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.153187 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.153188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldc9\" (UniqueName: \"kubernetes.io/projected/f18b4eb6-0704-4195-9dd1-fb662aa05b1e-kube-api-access-vldc9\") pod \"odh-model-controller-696fc77849-b8k5j\" (UID: \"f18b4eb6-0704-4195-9dd1-fb662aa05b1e\") " pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.153407 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.153297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf2w7\" (UniqueName: \"kubernetes.io/projected/9bab7228-870d-492d-be90-dc4a62c8b740-kube-api-access-mf2w7\") pod \"model-serving-api-86f7b4b499-w59zs\" (UID: \"9bab7228-870d-492d-be90-dc4a62c8b740\") " pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.153407 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.153342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9bab7228-870d-492d-be90-dc4a62c8b740-tls-certs\") pod \"model-serving-api-86f7b4b499-w59zs\" (UID: \"9bab7228-870d-492d-be90-dc4a62c8b740\") " pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.254351 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.254312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf2w7\" (UniqueName: \"kubernetes.io/projected/9bab7228-870d-492d-be90-dc4a62c8b740-kube-api-access-mf2w7\") pod \"model-serving-api-86f7b4b499-w59zs\" (UID: \"9bab7228-870d-492d-be90-dc4a62c8b740\") " pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.254351 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.254355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9bab7228-870d-492d-be90-dc4a62c8b740-tls-certs\") pod \"model-serving-api-86f7b4b499-w59zs\" (UID: \"9bab7228-870d-492d-be90-dc4a62c8b740\") " pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.254606 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.254410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b4eb6-0704-4195-9dd1-fb662aa05b1e-cert\") pod \"odh-model-controller-696fc77849-b8k5j\" (UID: \"f18b4eb6-0704-4195-9dd1-fb662aa05b1e\") " pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.254606 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.254432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vldc9\" (UniqueName: \"kubernetes.io/projected/f18b4eb6-0704-4195-9dd1-fb662aa05b1e-kube-api-access-vldc9\") pod \"odh-model-controller-696fc77849-b8k5j\" (UID: \"f18b4eb6-0704-4195-9dd1-fb662aa05b1e\") " pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.256924 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.256895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9bab7228-870d-492d-be90-dc4a62c8b740-tls-certs\") pod \"model-serving-api-86f7b4b499-w59zs\" (UID: \"9bab7228-870d-492d-be90-dc4a62c8b740\") " pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.257030 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.256959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b4eb6-0704-4195-9dd1-fb662aa05b1e-cert\") pod \"odh-model-controller-696fc77849-b8k5j\" (UID: \"f18b4eb6-0704-4195-9dd1-fb662aa05b1e\") " pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.262227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.262199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf2w7\" (UniqueName: \"kubernetes.io/projected/9bab7228-870d-492d-be90-dc4a62c8b740-kube-api-access-mf2w7\") pod \"model-serving-api-86f7b4b499-w59zs\" (UID: \"9bab7228-870d-492d-be90-dc4a62c8b740\") " pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.262322 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.262265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldc9\" (UniqueName: \"kubernetes.io/projected/f18b4eb6-0704-4195-9dd1-fb662aa05b1e-kube-api-access-vldc9\") pod \"odh-model-controller-696fc77849-b8k5j\" (UID: \"f18b4eb6-0704-4195-9dd1-fb662aa05b1e\") " pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.350232 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.350203 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:36.372875 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.372843 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:36.485234 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.485196 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-w59zs"]
Apr 16 14:06:36.488796 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:06:36.488770 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bab7228_870d_492d_be90_dc4a62c8b740.slice/crio-6e3ec7f1de01b29a7e3cd577dade7b2224e0a78f11b0a91417690e59fded2f75 WatchSource:0}: Error finding container 6e3ec7f1de01b29a7e3cd577dade7b2224e0a78f11b0a91417690e59fded2f75: Status 404 returned error can't find the container with id 6e3ec7f1de01b29a7e3cd577dade7b2224e0a78f11b0a91417690e59fded2f75
Apr 16 14:06:36.505148 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:36.505127 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-b8k5j"]
Apr 16 14:06:36.507372 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:06:36.507346 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf18b4eb6_0704_4195_9dd1_fb662aa05b1e.slice/crio-01976bb498b882eeec0ecaae21fb52a9f0f22dc423405882bb4d8e5d7c4fe6fc WatchSource:0}: Error finding container 01976bb498b882eeec0ecaae21fb52a9f0f22dc423405882bb4d8e5d7c4fe6fc: Status 404 returned error can't find the container with id 01976bb498b882eeec0ecaae21fb52a9f0f22dc423405882bb4d8e5d7c4fe6fc
Apr 16 14:06:37.070461 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:37.070411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-w59zs" event={"ID":"9bab7228-870d-492d-be90-dc4a62c8b740","Type":"ContainerStarted","Data":"6e3ec7f1de01b29a7e3cd577dade7b2224e0a78f11b0a91417690e59fded2f75"}
Apr 16 14:06:37.071637 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:37.071608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-b8k5j" event={"ID":"f18b4eb6-0704-4195-9dd1-fb662aa05b1e","Type":"ContainerStarted","Data":"01976bb498b882eeec0ecaae21fb52a9f0f22dc423405882bb4d8e5d7c4fe6fc"}
Apr 16 14:06:41.087742 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:41.087705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-b8k5j" event={"ID":"f18b4eb6-0704-4195-9dd1-fb662aa05b1e","Type":"ContainerStarted","Data":"45f7f7795241f40b8a2ab2a3b5d78d955e154da2a0b684bacfd884250ddf1ced"}
Apr 16 14:06:41.088212 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:41.087932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:41.089086 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:41.089056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-w59zs" event={"ID":"9bab7228-870d-492d-be90-dc4a62c8b740","Type":"ContainerStarted","Data":"a75379c36c3298c63a2dabff0282a1f9d0e6455ba67bd78a9a905b46c13654a7"}
Apr 16 14:06:41.089196 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:41.089187 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:06:41.104592 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:41.104548 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-b8k5j" podStartSLOduration=1.414720797 podStartE2EDuration="5.10453716s" podCreationTimestamp="2026-04-16 14:06:36 +0000 UTC" firstStartedPulling="2026-04-16 14:06:36.508618898 +0000 UTC m=+452.552652062" lastFinishedPulling="2026-04-16 14:06:40.198435257 +0000 UTC m=+456.242468425" observedRunningTime="2026-04-16 14:06:41.103204392 +0000 UTC m=+457.147237594" watchObservedRunningTime="2026-04-16 14:06:41.10453716 +0000 UTC m=+457.148570340"
Apr 16 14:06:41.119414 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:41.119374 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-w59zs" podStartSLOduration=1.3729779930000001 podStartE2EDuration="5.119361019s" podCreationTimestamp="2026-04-16 14:06:36 +0000 UTC" firstStartedPulling="2026-04-16 14:06:36.490618883 +0000 UTC m=+452.534652047" lastFinishedPulling="2026-04-16 14:06:40.23700189 +0000 UTC m=+456.281035073" observedRunningTime="2026-04-16 14:06:41.118338822 +0000 UTC m=+457.162371998" watchObservedRunningTime="2026-04-16 14:06:41.119361019 +0000 UTC m=+457.163394204"
Apr 16 14:06:52.094417 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:52.094378 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-b8k5j"
Apr 16 14:06:52.096559 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:06:52.096540 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-w59zs"
Apr 16 14:07:12.652785 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.652748 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"]
Apr 16 14:07:12.671949 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.671922 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"]
Apr 16 14:07:12.672116 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.672059 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"
Apr 16 14:07:12.674818 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.674796 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4rnlz\""
Apr 16 14:07:12.687542 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.687018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"
Apr 16 14:07:12.789711 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.789658 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"]
Apr 16 14:07:12.798792 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.798465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"
Apr 16 14:07:12.802026 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.801982 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"]
Apr 16 14:07:12.814318 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.813887 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"
Apr 16 14:07:12.861732 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.860428 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"]
Apr 16 14:07:12.867150 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:07:12.867106 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f483f6_5c0d_45ea_9cdf_e2dc242babd6.slice/crio-06dad40ec2de9fb712158930a1e8761b46e33e681f99ab0a33bbe95b717f60a8 WatchSource:0}: Error finding container 06dad40ec2de9fb712158930a1e8761b46e33e681f99ab0a33bbe95b717f60a8: Status 404 returned error can't find the container with id 06dad40ec2de9fb712158930a1e8761b46e33e681f99ab0a33bbe95b717f60a8
Apr 16 14:07:12.985609 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:12.985575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"]
Apr 16 14:07:12.989507 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:07:12.989476 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc47cb4fc_6838_4a29_9393_88b7464c421d.slice/crio-64629a3a7c533f24ea504eeff264b9488ee089241e4ce1965114f64bf3cd4af2 WatchSource:0}: Error finding container 64629a3a7c533f24ea504eeff264b9488ee089241e4ce1965114f64bf3cd4af2: Status 404 returned error can't find the container with id 64629a3a7c533f24ea504eeff264b9488ee089241e4ce1965114f64bf3cd4af2
Apr 16 14:07:13.199545 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:13.199462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" event={"ID":"c47cb4fc-6838-4a29-9393-88b7464c421d","Type":"ContainerStarted","Data":"64629a3a7c533f24ea504eeff264b9488ee089241e4ce1965114f64bf3cd4af2"}
Apr 16 14:07:13.200578 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:13.200552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" event={"ID":"76f483f6-5c0d-45ea-9cdf-e2dc242babd6","Type":"ContainerStarted","Data":"06dad40ec2de9fb712158930a1e8761b46e33e681f99ab0a33bbe95b717f60a8"}
Apr 16 14:07:29.277872 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.277766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" event={"ID":"c47cb4fc-6838-4a29-9393-88b7464c421d","Type":"ContainerStarted","Data":"81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907"}
Apr 16 14:07:29.278347 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.277982 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"
Apr 16 14:07:29.279463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.279429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" event={"ID":"76f483f6-5c0d-45ea-9cdf-e2dc242babd6","Type":"ContainerStarted","Data":"a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc"}
Apr 16 14:07:29.279590 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.279471 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"
Apr 16 14:07:29.279590 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.279545 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:07:29.280544 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.280521 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:07:29.293261 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.293218 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podStartSLOduration=1.42788092 podStartE2EDuration="17.293206425s" podCreationTimestamp="2026-04-16 14:07:12 +0000 UTC" firstStartedPulling="2026-04-16 14:07:12.99145215 +0000 UTC m=+489.035485317" lastFinishedPulling="2026-04-16 14:07:28.856777658 +0000 UTC m=+504.900810822" observedRunningTime="2026-04-16 14:07:29.29144224 +0000 UTC m=+505.335475427" watchObservedRunningTime="2026-04-16 14:07:29.293206425 +0000 UTC m=+505.337239612"
Apr 16 14:07:29.304569 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:29.304526 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podStartSLOduration=1.315021227 podStartE2EDuration="17.304510429s" podCreationTimestamp="2026-04-16 14:07:12 +0000 UTC" firstStartedPulling="2026-04-16 14:07:12.871451986 +0000 UTC m=+488.915485164" lastFinishedPulling="2026-04-16 14:07:28.860941202 +0000 UTC m=+504.904974366" observedRunningTime="2026-04-16 14:07:29.304284749 +0000 UTC m=+505.348317946" watchObservedRunningTime="2026-04-16 14:07:29.304510429 +0000 UTC m=+505.348543617"
Apr 16 14:07:30.283519 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:30.283480 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:07:30.283926 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:30.283480 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:07:40.284558 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:40.284517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:07:40.285034 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:40.284520 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:07:50.284651 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:50.284555 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:07:50.284651 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:07:50.284555 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:08:00.283761 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:00.283720 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:08:00.284208 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:00.283723 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:08:10.283924 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:10.283875 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:08:10.284322 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:10.283875 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:08:20.284868 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:20.284828 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"
Apr 16 14:08:20.285327 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:20.284899 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"
Apr 16 14:08:46.861627 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.861587 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"]
Apr 16 14:08:46.862143 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.861881 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" containerID="cri-o://a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc" gracePeriod=30
Apr 16 14:08:46.902283 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.902251 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"]
Apr 16 14:08:46.917273 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.917235 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"]
Apr 16 14:08:46.917483 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.917376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"
Apr 16 14:08:46.931861 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.931832 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"]
Apr 16 14:08:46.932343 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.932292 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" containerID="cri-o://81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907" gracePeriod=30
Apr 16 14:08:46.933221 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.933202 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"
Apr 16 14:08:46.956690 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.956643 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"]
Apr 16 14:08:46.961326 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.961305 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"
Apr 16 14:08:46.967805 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.967782 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"]
Apr 16 14:08:46.975113 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:46.975092 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"
Apr 16 14:08:47.092371 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.092302 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"]
Apr 16 14:08:47.097133 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:08:47.096527 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2920c16_df91_489f_b0a0_e329163ed7a2.slice/crio-73f15d68fdf326733ba805806cb80ec2a9526ff97c506965961c05f3fb95f5e1 WatchSource:0}: Error finding container 73f15d68fdf326733ba805806cb80ec2a9526ff97c506965961c05f3fb95f5e1: Status 404 returned error can't find the container with id 73f15d68fdf326733ba805806cb80ec2a9526ff97c506965961c05f3fb95f5e1
Apr 16 14:08:47.137453 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.137408 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"]
Apr 16 14:08:47.142264 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:08:47.142220 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda18206ee_fa26_4187_b14d_488a71898d3e.slice/crio-5e6d0ec4f0789c92189cd8da5679b062d045deeb8bef7d9c3e5eebcdf0226a6c WatchSource:0}: Error finding container 5e6d0ec4f0789c92189cd8da5679b062d045deeb8bef7d9c3e5eebcdf0226a6c: Status 404 returned error can't find the container with id 5e6d0ec4f0789c92189cd8da5679b062d045deeb8bef7d9c3e5eebcdf0226a6c
Apr 16 14:08:47.543984 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.543898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" event={"ID":"a18206ee-fa26-4187-b14d-488a71898d3e","Type":"ContainerStarted","Data":"ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2"}
Apr 16 14:08:47.543984 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.543936 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" event={"ID":"a18206ee-fa26-4187-b14d-488a71898d3e","Type":"ContainerStarted","Data":"5e6d0ec4f0789c92189cd8da5679b062d045deeb8bef7d9c3e5eebcdf0226a6c"}
Apr 16 14:08:47.544210 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.544193 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"
Apr 16 14:08:47.545377 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.545346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" event={"ID":"d2920c16-df91-489f-b0a0-e329163ed7a2","Type":"ContainerStarted","Data":"8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d"}
Apr 16 14:08:47.545493 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.545383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" event={"ID":"d2920c16-df91-489f-b0a0-e329163ed7a2","Type":"ContainerStarted","Data":"73f15d68fdf326733ba805806cb80ec2a9526ff97c506965961c05f3fb95f5e1"}
Apr 16 14:08:47.545493 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.545391 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused"
Apr 16 14:08:47.545571 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.545506 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"
Apr 16 14:08:47.546515 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.546493 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 16 14:08:47.558267 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.558230 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podStartSLOduration=1.558218587 podStartE2EDuration="1.558218587s" podCreationTimestamp="2026-04-16 14:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:08:47.557423003 +0000 UTC m=+583.601456187" watchObservedRunningTime="2026-04-16 14:08:47.558218587 +0000 UTC m=+583.602251773"
Apr 16 14:08:47.573222 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:47.573182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podStartSLOduration=1.573168876 podStartE2EDuration="1.573168876s" podCreationTimestamp="2026-04-16 14:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:08:47.572062513 +0000 UTC m=+583.616095700" watchObservedRunningTime="2026-04-16 14:08:47.573168876 +0000 UTC m=+583.617202063"
Apr 16 14:08:48.549402 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:48.549357 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused"
Apr 16 14:08:48.549814 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:48.549483 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 16 14:08:50.284510 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:50.284467 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 14:08:50.284931 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:50.284467 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 14:08:51.098952 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.098929 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"
Apr 16 14:08:51.102244 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.102226 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"
Apr 16 14:08:51.564019 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.563986 2573 generic.go:358] "Generic (PLEG): container finished" podID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerID="a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc" exitCode=0
Apr 16 14:08:51.564459 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.564050 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"
Apr 16 14:08:51.564459 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.564070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" event={"ID":"76f483f6-5c0d-45ea-9cdf-e2dc242babd6","Type":"ContainerDied","Data":"a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc"}
Apr 16 14:08:51.564459 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.564108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc" event={"ID":"76f483f6-5c0d-45ea-9cdf-e2dc242babd6","Type":"ContainerDied","Data":"06dad40ec2de9fb712158930a1e8761b46e33e681f99ab0a33bbe95b717f60a8"}
Apr 16 14:08:51.564459 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.564129 2573 scope.go:117] "RemoveContainer" containerID="a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc"
Apr 16 14:08:51.565272 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.565254 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerID="81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907" exitCode=0 Apr 16 14:08:51.565375 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.565282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" event={"ID":"c47cb4fc-6838-4a29-9393-88b7464c421d","Type":"ContainerDied","Data":"81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907"} Apr 16 14:08:51.565375 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.565299 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" event={"ID":"c47cb4fc-6838-4a29-9393-88b7464c421d","Type":"ContainerDied","Data":"64629a3a7c533f24ea504eeff264b9488ee089241e4ce1965114f64bf3cd4af2"} Apr 16 14:08:51.565375 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.565305 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc" Apr 16 14:08:51.572890 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.572868 2573 scope.go:117] "RemoveContainer" containerID="a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc" Apr 16 14:08:51.573170 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:08:51.573152 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc\": container with ID starting with a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc not found: ID does not exist" containerID="a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc" Apr 16 14:08:51.573238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.573187 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc"} err="failed to get container status \"a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc\": rpc error: code = NotFound desc = could not find container \"a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc\": container with ID starting with a5d74134bea12e2b538c3206a4e3cc0c45e7d3c1ff575ace6013a18580d7c6cc not found: ID does not exist" Apr 16 14:08:51.573238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.573212 2573 scope.go:117] "RemoveContainer" containerID="81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907" Apr 16 14:08:51.580547 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.580530 2573 scope.go:117] "RemoveContainer" containerID="81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907" Apr 16 14:08:51.580842 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:08:51.580822 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907\": container with ID starting with 81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907 not found: ID does not exist" containerID="81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907" Apr 16 14:08:51.580891 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.580849 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907"} err="failed to get container status \"81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907\": rpc error: code = NotFound desc = could not find container \"81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907\": container with ID starting with 81a0b1de33cd95fc5153bd4ef516122ded4edac4b54b5bd9a88e780482f1c907 not found: ID does not exist" Apr 16 14:08:51.588143 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.588124 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"] Apr 16 14:08:51.589712 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.589691 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b65bd-predictor-7b84f8cc4-bgvmc"] Apr 16 14:08:51.599394 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.599374 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"] Apr 16 14:08:51.604634 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:51.604613 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b65bd-predictor-75967c8fc9-kkvkc"] Apr 16 14:08:52.517201 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:52.517166 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" 
path="/var/lib/kubelet/pods/76f483f6-5c0d-45ea-9cdf-e2dc242babd6/volumes" Apr 16 14:08:52.517420 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:52.517408 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" path="/var/lib/kubelet/pods/c47cb4fc-6838-4a29-9393-88b7464c421d/volumes" Apr 16 14:08:58.549548 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:58.549510 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:08:58.549947 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:08:58.549510 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:09:04.459354 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:04.459323 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:09:04.459808 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:04.459524 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:09:04.462976 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:04.462942 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:09:04.468272 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:04.468248 2573 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:09:08.550402 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:08.550365 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:09:08.550789 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:08.550365 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:09:18.550067 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:18.549979 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:09:18.552647 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:18.549979 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:09:22.744633 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.744596 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2"] Apr 16 14:09:22.745023 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.744975 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" Apr 16 14:09:22.745023 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.744990 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" Apr 16 14:09:22.745023 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.745007 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" Apr 16 14:09:22.745023 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.745013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" Apr 16 14:09:22.745150 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.745071 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="76f483f6-5c0d-45ea-9cdf-e2dc242babd6" containerName="kserve-container" Apr 16 14:09:22.745150 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.745084 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c47cb4fc-6838-4a29-9393-88b7464c421d" containerName="kserve-container" Apr 16 14:09:22.748149 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.748130 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" Apr 16 14:09:22.755597 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.755557 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2"] Apr 16 14:09:22.761036 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.761012 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" Apr 16 14:09:22.841858 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.841788 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm"] Apr 16 14:09:22.846058 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.845813 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" Apr 16 14:09:22.848257 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.848228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm"] Apr 16 14:09:22.859558 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.859536 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" Apr 16 14:09:22.911771 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.911729 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2"] Apr 16 14:09:22.914319 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:09:22.914287 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705a79d2_76f4_48f3_8bdf_345154bcccc0.slice/crio-ebd6d772f149d7a11449dbe3473188a00028d3d7de07ac3ae40d41504ae09abc WatchSource:0}: Error finding container ebd6d772f149d7a11449dbe3473188a00028d3d7de07ac3ae40d41504ae09abc: Status 404 returned error can't find the container with id ebd6d772f149d7a11449dbe3473188a00028d3d7de07ac3ae40d41504ae09abc Apr 16 14:09:22.916241 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:22.916224 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:09:23.007544 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:09:23.007517 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm"] Apr 16 14:09:23.009589 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:09:23.009543 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821f0e3c_d3c4_4660_ac89_ce39c2ca474c.slice/crio-59d7769441e99db1626d5e740d682d364df6c5a4bdc6bb6ae375dbeecb152c3d WatchSource:0}: Error finding container 59d7769441e99db1626d5e740d682d364df6c5a4bdc6bb6ae375dbeecb152c3d: Status 404 returned error can't find the container with id 59d7769441e99db1626d5e740d682d364df6c5a4bdc6bb6ae375dbeecb152c3d Apr 16 14:09:23.671858 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.671820 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" event={"ID":"705a79d2-76f4-48f3-8bdf-345154bcccc0","Type":"ContainerStarted","Data":"47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f"} Apr 16 14:09:23.671858 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.671864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" event={"ID":"705a79d2-76f4-48f3-8bdf-345154bcccc0","Type":"ContainerStarted","Data":"ebd6d772f149d7a11449dbe3473188a00028d3d7de07ac3ae40d41504ae09abc"} Apr 16 14:09:23.672122 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.671977 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" Apr 16 14:09:23.673111 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.673085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" 
event={"ID":"821f0e3c-d3c4-4660-ac89-ce39c2ca474c","Type":"ContainerStarted","Data":"e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117"} Apr 16 14:09:23.673204 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.673114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" event={"ID":"821f0e3c-d3c4-4660-ac89-ce39c2ca474c","Type":"ContainerStarted","Data":"59d7769441e99db1626d5e740d682d364df6c5a4bdc6bb6ae375dbeecb152c3d"} Apr 16 14:09:23.673256 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.673232 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 14:09:23.673349 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.673330 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" Apr 16 14:09:23.674283 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.674260 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 14:09:23.687300 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.687259 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podStartSLOduration=1.68724658 podStartE2EDuration="1.68724658s" podCreationTimestamp="2026-04-16 14:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
14:09:23.685412233 +0000 UTC m=+619.729445420" watchObservedRunningTime="2026-04-16 14:09:23.68724658 +0000 UTC m=+619.731279766" Apr 16 14:09:23.698629 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:23.698583 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podStartSLOduration=1.698570812 podStartE2EDuration="1.698570812s" podCreationTimestamp="2026-04-16 14:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:09:23.697881587 +0000 UTC m=+619.741914774" watchObservedRunningTime="2026-04-16 14:09:23.698570812 +0000 UTC m=+619.742604047" Apr 16 14:09:24.677351 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:24.677306 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 14:09:24.677816 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:24.677435 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 14:09:28.550252 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:28.550205 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:09:28.550741 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:28.550216 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:09:34.677774 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:34.677728 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 14:09:34.678150 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:34.677728 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 14:09:38.551381 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:38.551349 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" Apr 16 14:09:38.551770 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:38.551403 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" Apr 16 14:09:44.678396 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:44.678349 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 14:09:44.678800 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:44.678349 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 14:09:54.677810 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:54.677765 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 14:09:54.678285 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:09:54.677765 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 14:10:04.677747 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:10:04.677703 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 14:10:04.678124 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:10:04.677704 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 14:10:14.678357 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:10:14.678328 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" Apr 16 
14:10:14.678804 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:10:14.678380 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" Apr 16 14:14:04.488509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:14:04.488430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:14:04.490280 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:14:04.490258 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:14:04.491770 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:14:04.491751 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:14:04.493371 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:14:04.493353 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:18:11.853720 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.853682 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"] Apr 16 14:18:11.854274 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.853992 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" containerID="cri-o://8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d" gracePeriod=30 Apr 16 14:18:11.897167 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.897129 2573 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"] Apr 16 14:18:11.897396 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.897370 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" containerID="cri-o://ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2" gracePeriod=30 Apr 16 14:18:11.940570 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.940532 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn"] Apr 16 14:18:11.944509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.944485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" Apr 16 14:18:11.954884 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.954864 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" Apr 16 14:18:11.965003 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.964968 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn"] Apr 16 14:18:11.972208 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.972176 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7"] Apr 16 14:18:11.976538 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.976515 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" Apr 16 14:18:11.987190 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:11.987162 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7"] Apr 16 14:18:12.012396 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.012362 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" Apr 16 14:18:12.149830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.148694 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn"] Apr 16 14:18:12.149830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.148797 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:18:12.207209 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.207181 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7"] Apr 16 14:18:12.209961 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:18:12.209929 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f4e5e05_a06c_4a75_b369_f49db00cd63d.slice/crio-fa5a6271bf31d7bb08810a9d3adaab82db5f3bc9756ab012b0c5acb7b987b7fa WatchSource:0}: Error finding container fa5a6271bf31d7bb08810a9d3adaab82db5f3bc9756ab012b0c5acb7b987b7fa: Status 404 returned error can't find the container with id fa5a6271bf31d7bb08810a9d3adaab82db5f3bc9756ab012b0c5acb7b987b7fa Apr 16 14:18:12.460253 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.460153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" 
event={"ID":"0f4e5e05-a06c-4a75-b369-f49db00cd63d","Type":"ContainerStarted","Data":"db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461"} Apr 16 14:18:12.460253 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.460205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" event={"ID":"0f4e5e05-a06c-4a75-b369-f49db00cd63d","Type":"ContainerStarted","Data":"fa5a6271bf31d7bb08810a9d3adaab82db5f3bc9756ab012b0c5acb7b987b7fa"} Apr 16 14:18:12.460520 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.460489 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" Apr 16 14:18:12.461623 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.461598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" event={"ID":"5c0b3289-80af-4bbe-a292-2c620711f4c5","Type":"ContainerStarted","Data":"d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b"} Apr 16 14:18:12.461623 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.461626 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" event={"ID":"5c0b3289-80af-4bbe-a292-2c620711f4c5","Type":"ContainerStarted","Data":"439155c3115617060df8f80c642077a037fb36c9900041334e6941e2866695fc"} Apr 16 14:18:12.461830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.461814 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" Apr 16 14:18:12.462190 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.462164 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:18:12.462853 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.462833 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:18:12.475504 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.475466 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podStartSLOduration=1.475454435 podStartE2EDuration="1.475454435s" podCreationTimestamp="2026-04-16 14:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:12.474213762 +0000 UTC m=+1148.518246951" watchObservedRunningTime="2026-04-16 14:18:12.475454435 +0000 UTC m=+1148.519487620" Apr 16 14:18:12.488443 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:12.488399 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podStartSLOduration=1.488386814 podStartE2EDuration="1.488386814s" podCreationTimestamp="2026-04-16 14:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:12.487267488 +0000 UTC m=+1148.531300673" watchObservedRunningTime="2026-04-16 14:18:12.488386814 +0000 UTC m=+1148.532420000" Apr 16 14:18:13.465783 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:13.465741 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:18:13.466230 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:13.465743 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:18:15.919469 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:15.919446 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" Apr 16 14:18:16.039404 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.039374 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" Apr 16 14:18:16.481523 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.481486 2573 generic.go:358] "Generic (PLEG): container finished" podID="a18206ee-fa26-4187-b14d-488a71898d3e" containerID="ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2" exitCode=0 Apr 16 14:18:16.481713 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.481553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" event={"ID":"a18206ee-fa26-4187-b14d-488a71898d3e","Type":"ContainerDied","Data":"ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2"} Apr 16 14:18:16.481713 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.481582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" event={"ID":"a18206ee-fa26-4187-b14d-488a71898d3e","Type":"ContainerDied","Data":"5e6d0ec4f0789c92189cd8da5679b062d045deeb8bef7d9c3e5eebcdf0226a6c"} Apr 16 14:18:16.481713 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:18:16.481555 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw" Apr 16 14:18:16.481713 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.481604 2573 scope.go:117] "RemoveContainer" containerID="ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2" Apr 16 14:18:16.482627 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.482604 2573 generic.go:358] "Generic (PLEG): container finished" podID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerID="8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d" exitCode=0 Apr 16 14:18:16.482711 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.482659 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" Apr 16 14:18:16.482775 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.482658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" event={"ID":"d2920c16-df91-489f-b0a0-e329163ed7a2","Type":"ContainerDied","Data":"8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d"} Apr 16 14:18:16.482830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.482776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp" event={"ID":"d2920c16-df91-489f-b0a0-e329163ed7a2","Type":"ContainerDied","Data":"73f15d68fdf326733ba805806cb80ec2a9526ff97c506965961c05f3fb95f5e1"} Apr 16 14:18:16.490152 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.490130 2573 scope.go:117] "RemoveContainer" containerID="ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2" Apr 16 14:18:16.490448 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:18:16.490421 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2\": container with ID starting with ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2 not found: ID does not exist" containerID="ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2" Apr 16 14:18:16.490536 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.490460 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2"} err="failed to get container status \"ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2\": rpc error: code = NotFound desc = could not find container \"ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2\": container with ID starting with ac2052008bab8431dc882d0c2c69617c975acaa3905d86bbefacbda80fdd13f2 not found: ID does not exist" Apr 16 14:18:16.490536 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.490483 2573 scope.go:117] "RemoveContainer" containerID="8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d" Apr 16 14:18:16.497807 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.497791 2573 scope.go:117] "RemoveContainer" containerID="8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d" Apr 16 14:18:16.498050 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:18:16.498031 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d\": container with ID starting with 8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d not found: ID does not exist" containerID="8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d" Apr 16 14:18:16.498105 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.498061 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d"} err="failed to get container status \"8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d\": rpc error: code = NotFound desc = could not find container \"8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d\": container with ID starting with 8c250ad04d0af3b38a469e9fb6eb89f150e6937d3109daf255bf07da400cd68d not found: ID does not exist" Apr 16 14:18:16.506073 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.506052 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"] Apr 16 14:18:16.507725 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.507705 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7cd29-predictor-bc98c47fd-bdwpp"] Apr 16 14:18:16.517078 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.517057 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" path="/var/lib/kubelet/pods/d2920c16-df91-489f-b0a0-e329163ed7a2/volumes" Apr 16 14:18:16.519947 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.519926 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"] Apr 16 14:18:16.524281 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:16.524251 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7cd29-predictor-d9cbb6df9-mssdw"] Apr 16 14:18:18.517693 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:18.517601 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" path="/var/lib/kubelet/pods/a18206ee-fa26-4187-b14d-488a71898d3e/volumes" Apr 16 14:18:23.466661 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:23.466621 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:18:23.467081 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:23.466620 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:18:33.466752 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:33.466706 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:18:33.467132 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:33.466706 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:18:43.466595 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:43.466549 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:18:43.467005 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:43.466549 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" 
podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:18:47.641905 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.641862 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2"] Apr 16 14:18:47.642336 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.642162 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" containerID="cri-o://47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f" gracePeriod=30 Apr 16 14:18:47.671224 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671193 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4"] Apr 16 14:18:47.671618 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671599 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" Apr 16 14:18:47.671618 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671617 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" Apr 16 14:18:47.671797 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671636 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" Apr 16 14:18:47.671797 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671642 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" Apr 16 14:18:47.671797 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671729 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="d2920c16-df91-489f-b0a0-e329163ed7a2" containerName="kserve-container" Apr 16 14:18:47.671797 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.671738 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a18206ee-fa26-4187-b14d-488a71898d3e" containerName="kserve-container" Apr 16 14:18:47.674907 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.674891 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" Apr 16 14:18:47.681901 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.681876 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4"] Apr 16 14:18:47.687203 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.687187 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" Apr 16 14:18:47.714642 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.714610 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm"] Apr 16 14:18:47.714929 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.714888 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" containerID="cri-o://e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117" gracePeriod=30 Apr 16 14:18:47.756704 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.756651 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft"] Apr 16 14:18:47.761977 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.761948 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" Apr 16 14:18:47.768445 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.768401 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft"] Apr 16 14:18:47.780196 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.780155 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" Apr 16 14:18:47.846761 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.846595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4"] Apr 16 14:18:47.849440 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:18:47.849397 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f8aa8c_051a_4212_9729_ac903b1beae6.slice/crio-363d6916107ba01a380b321e482e50d2f6ac137a6fca095123bfdf3b5209be98 WatchSource:0}: Error finding container 363d6916107ba01a380b321e482e50d2f6ac137a6fca095123bfdf3b5209be98: Status 404 returned error can't find the container with id 363d6916107ba01a380b321e482e50d2f6ac137a6fca095123bfdf3b5209be98 Apr 16 14:18:47.931375 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:47.931346 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft"] Apr 16 14:18:47.946141 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:18:47.946102 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f4d5611_1b45_4ad6_b88d_9467d30c8490.slice/crio-3ef774ab207e5d14982fc9410a0b1bdc8b156fe1fefd74773122c1809f266227 WatchSource:0}: Error finding container 3ef774ab207e5d14982fc9410a0b1bdc8b156fe1fefd74773122c1809f266227: Status 404 returned error can't find 
the container with id 3ef774ab207e5d14982fc9410a0b1bdc8b156fe1fefd74773122c1809f266227 Apr 16 14:18:48.590543 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.590501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" event={"ID":"1f4d5611-1b45-4ad6-b88d-9467d30c8490","Type":"ContainerStarted","Data":"5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881"} Apr 16 14:18:48.590543 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.590545 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" event={"ID":"1f4d5611-1b45-4ad6-b88d-9467d30c8490","Type":"ContainerStarted","Data":"3ef774ab207e5d14982fc9410a0b1bdc8b156fe1fefd74773122c1809f266227"} Apr 16 14:18:48.590845 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.590564 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" Apr 16 14:18:48.591909 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.591885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" event={"ID":"80f8aa8c-051a-4212-9729-ac903b1beae6","Type":"ContainerStarted","Data":"b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c"} Apr 16 14:18:48.592004 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.591916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" event={"ID":"80f8aa8c-051a-4212-9729-ac903b1beae6","Type":"ContainerStarted","Data":"363d6916107ba01a380b321e482e50d2f6ac137a6fca095123bfdf3b5209be98"} Apr 16 14:18:48.592113 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.592084 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" 
podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 14:18:48.592159 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.592100 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" Apr 16 14:18:48.593030 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.593007 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:18:48.606880 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.606843 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podStartSLOduration=1.606831994 podStartE2EDuration="1.606831994s" podCreationTimestamp="2026-04-16 14:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:48.605205773 +0000 UTC m=+1184.649238961" watchObservedRunningTime="2026-04-16 14:18:48.606831994 +0000 UTC m=+1184.650865228" Apr 16 14:18:48.619119 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:48.619076 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podStartSLOduration=1.619062795 podStartE2EDuration="1.619062795s" podCreationTimestamp="2026-04-16 14:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:48.618024237 +0000 UTC m=+1184.662057436" watchObservedRunningTime="2026-04-16 14:18:48.619062795 +0000 UTC 
m=+1184.663095983" Apr 16 14:18:49.595050 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:49.595008 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:18:49.595501 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:49.595018 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 14:18:50.992297 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:50.992267 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" Apr 16 14:18:51.601984 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.601949 2573 generic.go:358] "Generic (PLEG): container finished" podID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerID="47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f" exitCode=0 Apr 16 14:18:51.602155 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.602010 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" Apr 16 14:18:51.602155 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.602033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" event={"ID":"705a79d2-76f4-48f3-8bdf-345154bcccc0","Type":"ContainerDied","Data":"47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f"} Apr 16 14:18:51.602155 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.602070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2" event={"ID":"705a79d2-76f4-48f3-8bdf-345154bcccc0","Type":"ContainerDied","Data":"ebd6d772f149d7a11449dbe3473188a00028d3d7de07ac3ae40d41504ae09abc"} Apr 16 14:18:51.602155 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.602088 2573 scope.go:117] "RemoveContainer" containerID="47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f" Apr 16 14:18:51.610441 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.610426 2573 scope.go:117] "RemoveContainer" containerID="47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f" Apr 16 14:18:51.610712 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:18:51.610687 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f\": container with ID starting with 47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f not found: ID does not exist" containerID="47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f" Apr 16 14:18:51.610806 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.610717 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f"} err="failed to get container status 
\"47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f\": rpc error: code = NotFound desc = could not find container \"47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f\": container with ID starting with 47893882a626b95f8d9bb06ccc9454d16ec586f918cb479891978e58103cce5f not found: ID does not exist" Apr 16 14:18:51.622504 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.622482 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2"] Apr 16 14:18:51.623994 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.623975 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3604c-predictor-67675654b4-qjrd2"] Apr 16 14:18:51.978913 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:51.978891 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" Apr 16 14:18:52.516518 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.516484 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" path="/var/lib/kubelet/pods/705a79d2-76f4-48f3-8bdf-345154bcccc0/volumes" Apr 16 14:18:52.607452 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.607417 2573 generic.go:358] "Generic (PLEG): container finished" podID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerID="e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117" exitCode=0 Apr 16 14:18:52.607635 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.607480 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" Apr 16 14:18:52.607635 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.607479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" event={"ID":"821f0e3c-d3c4-4660-ac89-ce39c2ca474c","Type":"ContainerDied","Data":"e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117"} Apr 16 14:18:52.607635 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.607587 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm" event={"ID":"821f0e3c-d3c4-4660-ac89-ce39c2ca474c","Type":"ContainerDied","Data":"59d7769441e99db1626d5e740d682d364df6c5a4bdc6bb6ae375dbeecb152c3d"} Apr 16 14:18:52.607635 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.607608 2573 scope.go:117] "RemoveContainer" containerID="e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117" Apr 16 14:18:52.615495 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.615479 2573 scope.go:117] "RemoveContainer" containerID="e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117" Apr 16 14:18:52.615991 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:18:52.615973 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117\": container with ID starting with e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117 not found: ID does not exist" containerID="e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117" Apr 16 14:18:52.616044 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.615999 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117"} err="failed to get container status 
\"e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117\": rpc error: code = NotFound desc = could not find container \"e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117\": container with ID starting with e50fcbc3faa882bbdfab8421555e9a464fdfea07c8925d4bc1e9c01935355117 not found: ID does not exist" Apr 16 14:18:52.623434 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.623400 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm"] Apr 16 14:18:52.626539 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:52.626519 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3604c-predictor-6c9f6657bf-j9tfm"] Apr 16 14:18:53.466014 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:53.465968 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:18:53.466194 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:53.465968 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:18:54.517933 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:54.517902 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" path="/var/lib/kubelet/pods/821f0e3c-d3c4-4660-ac89-ce39c2ca474c/volumes" Apr 16 14:18:59.595534 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:59.595488 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" 
podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 14:18:59.595973 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:18:59.595491 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:19:03.466643 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:03.466605 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" Apr 16 14:19:03.467127 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:03.466942 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" Apr 16 14:19:04.513358 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:04.513329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:19:04.520070 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:04.520048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:19:04.520480 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:04.520464 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:19:04.523090 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:04.523071 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:19:09.595785 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:09.595729 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:19:09.596238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:09.595731 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 14:19:19.595547 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:19.595504 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:19:19.595984 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:19.595504 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 14:19:29.595824 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:29.595780 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: 
connection refused" Apr 16 14:19:29.596282 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:29.595780 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:19:32.151819 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.151782 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn"] Apr 16 14:19:32.152258 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.152074 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" containerID="cri-o://d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b" gracePeriod=30 Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.186049 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr"] Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.186433 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.186444 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.186458 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:19:32.186465 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.186531 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="705a79d2-76f4-48f3-8bdf-345154bcccc0" containerName="kserve-container" Apr 16 14:19:32.186737 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.186538 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="821f0e3c-d3c4-4660-ac89-ce39c2ca474c" containerName="kserve-container" Apr 16 14:19:32.189780 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.189743 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" Apr 16 14:19:32.197700 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.197656 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr"] Apr 16 14:19:32.204806 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.204783 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" Apr 16 14:19:32.230054 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.230022 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7"] Apr 16 14:19:32.230541 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.230507 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" containerID="cri-o://db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461" gracePeriod=30 Apr 16 14:19:32.260971 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.260936 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx"] Apr 16 14:19:32.267385 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.266335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" Apr 16 14:19:32.274017 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.272578 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx"] Apr 16 14:19:32.288971 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.288554 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" Apr 16 14:19:32.375705 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.375638 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr"] Apr 16 14:19:32.380737 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:19:32.380702 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0fd8164_3693_4847_aa40_b634caa56fc9.slice/crio-b20c85bfcb8eb120277109e0c6797b286b04f65aafdd3a9e791fefd8ea2b0005 WatchSource:0}: Error finding container b20c85bfcb8eb120277109e0c6797b286b04f65aafdd3a9e791fefd8ea2b0005: Status 404 returned error can't find the container with id b20c85bfcb8eb120277109e0c6797b286b04f65aafdd3a9e791fefd8ea2b0005 Apr 16 14:19:32.434345 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.434246 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx"] Apr 16 14:19:32.438797 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:19:32.438740 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed9aa88_fcbc_4dbf_b4d4_6408e11815d3.slice/crio-fb23f07519c131b14920f4c43afeefef2bd1bea6e0d18f1211cb9baee71c6954 WatchSource:0}: Error finding container fb23f07519c131b14920f4c43afeefef2bd1bea6e0d18f1211cb9baee71c6954: Status 404 returned error can't find the container with id fb23f07519c131b14920f4c43afeefef2bd1bea6e0d18f1211cb9baee71c6954 Apr 16 14:19:32.743382 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.743295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" 
event={"ID":"b0fd8164-3693-4847-aa40-b634caa56fc9","Type":"ContainerStarted","Data":"63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e"} Apr 16 14:19:32.743382 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.743337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" event={"ID":"b0fd8164-3693-4847-aa40-b634caa56fc9","Type":"ContainerStarted","Data":"b20c85bfcb8eb120277109e0c6797b286b04f65aafdd3a9e791fefd8ea2b0005"} Apr 16 14:19:32.743382 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.743360 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" Apr 16 14:19:32.744716 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.744686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" event={"ID":"7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3","Type":"ContainerStarted","Data":"472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be"} Apr 16 14:19:32.744716 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.744717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" event={"ID":"7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3","Type":"ContainerStarted","Data":"fb23f07519c131b14920f4c43afeefef2bd1bea6e0d18f1211cb9baee71c6954"} Apr 16 14:19:32.744880 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.744868 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" Apr 16 14:19:32.745031 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.745010 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 14:19:32.745807 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.745784 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 14:19:32.760937 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.760895 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podStartSLOduration=0.760882283 podStartE2EDuration="760.882283ms" podCreationTimestamp="2026-04-16 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:32.759927372 +0000 UTC m=+1228.803960558" watchObservedRunningTime="2026-04-16 14:19:32.760882283 +0000 UTC m=+1228.804915475" Apr 16 14:19:32.776358 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:32.776309 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podStartSLOduration=0.776287129 podStartE2EDuration="776.287129ms" podCreationTimestamp="2026-04-16 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:32.774777242 +0000 UTC m=+1228.818810439" watchObservedRunningTime="2026-04-16 14:19:32.776287129 +0000 UTC m=+1228.820320317" Apr 16 14:19:33.466087 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:33.466040 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 14:19:33.466433 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:33.466040 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 14:19:33.748277 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:33.748189 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 14:19:33.748458 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:33.748273 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 14:19:35.575281 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.575252 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" Apr 16 14:19:35.755830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.755737 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerID="db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461" exitCode=0 Apr 16 14:19:35.755830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.755807 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" Apr 16 14:19:35.755830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.755814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" event={"ID":"0f4e5e05-a06c-4a75-b369-f49db00cd63d","Type":"ContainerDied","Data":"db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461"} Apr 16 14:19:35.756095 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.755850 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7" event={"ID":"0f4e5e05-a06c-4a75-b369-f49db00cd63d","Type":"ContainerDied","Data":"fa5a6271bf31d7bb08810a9d3adaab82db5f3bc9756ab012b0c5acb7b987b7fa"} Apr 16 14:19:35.756095 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.755864 2573 scope.go:117] "RemoveContainer" containerID="db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461" Apr 16 14:19:35.766916 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.766897 2573 scope.go:117] "RemoveContainer" containerID="db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461" Apr 16 14:19:35.767155 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:19:35.767135 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461\": container with ID starting with db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461 not found: ID does not exist" containerID="db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461" Apr 16 14:19:35.767194 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.767163 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461"} err="failed to get container status 
\"db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461\": rpc error: code = NotFound desc = could not find container \"db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461\": container with ID starting with db21d61d7370825bc078201c74aeb59f887369733bfeeac24eec7225a207e461 not found: ID does not exist" Apr 16 14:19:35.779074 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.779045 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7"] Apr 16 14:19:35.782993 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:35.782973 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-305b2-predictor-58965b9845-7m8b7"] Apr 16 14:19:36.292213 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.292189 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" Apr 16 14:19:36.516693 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.516603 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" path="/var/lib/kubelet/pods/0f4e5e05-a06c-4a75-b369-f49db00cd63d/volumes" Apr 16 14:19:36.760817 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.760778 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerID="d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b" exitCode=0 Apr 16 14:19:36.761255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.760850 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" Apr 16 14:19:36.761255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.760849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" event={"ID":"5c0b3289-80af-4bbe-a292-2c620711f4c5","Type":"ContainerDied","Data":"d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b"} Apr 16 14:19:36.761255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.760956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn" event={"ID":"5c0b3289-80af-4bbe-a292-2c620711f4c5","Type":"ContainerDied","Data":"439155c3115617060df8f80c642077a037fb36c9900041334e6941e2866695fc"} Apr 16 14:19:36.761255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.760975 2573 scope.go:117] "RemoveContainer" containerID="d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b" Apr 16 14:19:36.769156 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.769128 2573 scope.go:117] "RemoveContainer" containerID="d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b" Apr 16 14:19:36.769415 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:19:36.769387 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b\": container with ID starting with d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b not found: ID does not exist" containerID="d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b" Apr 16 14:19:36.769470 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.769427 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b"} err="failed to get container status 
\"d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b\": rpc error: code = NotFound desc = could not find container \"d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b\": container with ID starting with d0208892fd8bcdcdf6e1fb69e14b00da40214be8e1d1c8cf25225fb5c973622b not found: ID does not exist" Apr 16 14:19:36.778055 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.778004 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn"] Apr 16 14:19:36.780057 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:36.780036 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-305b2-predictor-7466c57bb4-tjdwn"] Apr 16 14:19:38.516760 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:38.516720 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" path="/var/lib/kubelet/pods/5c0b3289-80af-4bbe-a292-2c620711f4c5/volumes" Apr 16 14:19:39.596510 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:39.596477 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" Apr 16 14:19:39.597003 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:39.596919 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" Apr 16 14:19:43.748738 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:43.748689 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 14:19:43.749106 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:43.748690 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 14:19:53.748785 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:53.748744 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 14:19:53.749171 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:19:53.748745 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 14:20:03.749265 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:03.749217 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 14:20:03.749629 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:03.749217 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 14:20:07.926440 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.926407 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4"] Apr 16 14:20:07.927026 
ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.926728 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" containerID="cri-o://b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c" gracePeriod=30 Apr 16 14:20:07.956700 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.956650 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787"] Apr 16 14:20:07.957093 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.957078 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" Apr 16 14:20:07.957137 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.957097 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" Apr 16 14:20:07.957137 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.957108 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" Apr 16 14:20:07.957137 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.957114 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" Apr 16 14:20:07.957235 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.957183 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f4e5e05-a06c-4a75-b369-f49db00cd63d" containerName="kserve-container" Apr 16 14:20:07.957235 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.957193 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c0b3289-80af-4bbe-a292-2c620711f4c5" containerName="kserve-container" Apr 16 14:20:07.961542 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:20:07.961523 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" Apr 16 14:20:07.968987 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.968885 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787"] Apr 16 14:20:07.973031 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:07.973011 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" Apr 16 14:20:08.010785 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.010718 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft"] Apr 16 14:20:08.011121 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.011061 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" containerID="cri-o://5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881" gracePeriod=30 Apr 16 14:20:08.045811 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.045779 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7"] Apr 16 14:20:08.053865 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.053832 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7"] Apr 16 14:20:08.054182 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.054122 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" Apr 16 14:20:08.070218 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.070103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" Apr 16 14:20:08.129054 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.129017 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787"] Apr 16 14:20:08.417390 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.417362 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7"] Apr 16 14:20:08.419719 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:20:08.419689 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87dc1303_63f2_4666_b43e_680609ed55ac.slice/crio-55b90eeb9e2565f61af11b794395121c59bdcf077e52a42f8f5faefc6318ca12 WatchSource:0}: Error finding container 55b90eeb9e2565f61af11b794395121c59bdcf077e52a42f8f5faefc6318ca12: Status 404 returned error can't find the container with id 55b90eeb9e2565f61af11b794395121c59bdcf077e52a42f8f5faefc6318ca12 Apr 16 14:20:08.867407 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.867363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" event={"ID":"c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5","Type":"ContainerStarted","Data":"60d22cfe0068e4c12ab949e02a80bd26b99ae2c9cf1e8d889f3bb7e0558dff9f"} Apr 16 14:20:08.867407 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.867407 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" Apr 16 14:20:08.867705 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.867420 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" event={"ID":"c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5","Type":"ContainerStarted","Data":"7b82f5298542c20a2df22ec6d880c6e8d5e732525042fcf36e76c19bcef4143d"} Apr 16 14:20:08.868859 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.868835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" event={"ID":"87dc1303-63f2-4666-b43e-680609ed55ac","Type":"ContainerStarted","Data":"2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7"} Apr 16 14:20:08.868970 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.868864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" event={"ID":"87dc1303-63f2-4666-b43e-680609ed55ac","Type":"ContainerStarted","Data":"55b90eeb9e2565f61af11b794395121c59bdcf077e52a42f8f5faefc6318ca12"} Apr 16 14:20:08.869069 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.869048 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" Apr 16 14:20:08.869196 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.869173 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 14:20:08.869998 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.869975 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 
14:20:08.882037 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.881991 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podStartSLOduration=1.881979017 podStartE2EDuration="1.881979017s" podCreationTimestamp="2026-04-16 14:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:08.881243517 +0000 UTC m=+1264.925276706" watchObservedRunningTime="2026-04-16 14:20:08.881979017 +0000 UTC m=+1264.926012200" Apr 16 14:20:08.895299 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:08.895262 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podStartSLOduration=0.895250087 podStartE2EDuration="895.250087ms" podCreationTimestamp="2026-04-16 14:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:08.89373126 +0000 UTC m=+1264.937764447" watchObservedRunningTime="2026-04-16 14:20:08.895250087 +0000 UTC m=+1264.939283273" Apr 16 14:20:09.595152 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:09.595107 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 14:20:09.595516 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:09.595110 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 
14:20:09.872826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:09.872731 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 14:20:09.872826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:09.872786 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 14:20:11.673944 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.673921 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" Apr 16 14:20:11.880817 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.880776 2573 generic.go:358] "Generic (PLEG): container finished" podID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerID="b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c" exitCode=0 Apr 16 14:20:11.880994 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.880833 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" Apr 16 14:20:11.880994 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.880859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" event={"ID":"80f8aa8c-051a-4212-9729-ac903b1beae6","Type":"ContainerDied","Data":"b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c"} Apr 16 14:20:11.880994 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.880898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4" event={"ID":"80f8aa8c-051a-4212-9729-ac903b1beae6","Type":"ContainerDied","Data":"363d6916107ba01a380b321e482e50d2f6ac137a6fca095123bfdf3b5209be98"} Apr 16 14:20:11.880994 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.880915 2573 scope.go:117] "RemoveContainer" containerID="b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c" Apr 16 14:20:11.889464 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.889447 2573 scope.go:117] "RemoveContainer" containerID="b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c" Apr 16 14:20:11.889735 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:20:11.889705 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c\": container with ID starting with b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c not found: ID does not exist" containerID="b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c" Apr 16 14:20:11.889836 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.889741 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c"} err="failed to get container status 
\"b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c\": rpc error: code = NotFound desc = could not find container \"b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c\": container with ID starting with b04e006c02b41256fc4d1cc4430981b7a11446738c2f2d19d36eadfbea5f254c not found: ID does not exist" Apr 16 14:20:11.900962 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.900930 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4"] Apr 16 14:20:11.903938 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:11.903916 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a84b0-predictor-6b596fcd54-8mjt4"] Apr 16 14:20:12.181448 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.181424 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" Apr 16 14:20:12.517546 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.517470 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" path="/var/lib/kubelet/pods/80f8aa8c-051a-4212-9729-ac903b1beae6/volumes" Apr 16 14:20:12.886509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.886469 2573 generic.go:358] "Generic (PLEG): container finished" podID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerID="5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881" exitCode=0 Apr 16 14:20:12.886968 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.886559 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" Apr 16 14:20:12.886968 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.886558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" event={"ID":"1f4d5611-1b45-4ad6-b88d-9467d30c8490","Type":"ContainerDied","Data":"5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881"} Apr 16 14:20:12.886968 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.886602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft" event={"ID":"1f4d5611-1b45-4ad6-b88d-9467d30c8490","Type":"ContainerDied","Data":"3ef774ab207e5d14982fc9410a0b1bdc8b156fe1fefd74773122c1809f266227"} Apr 16 14:20:12.886968 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.886622 2573 scope.go:117] "RemoveContainer" containerID="5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881" Apr 16 14:20:12.894763 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.894747 2573 scope.go:117] "RemoveContainer" containerID="5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881" Apr 16 14:20:12.895010 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:20:12.894992 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881\": container with ID starting with 5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881 not found: ID does not exist" containerID="5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881" Apr 16 14:20:12.895060 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.895018 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881"} err="failed to get container status 
\"5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881\": rpc error: code = NotFound desc = could not find container \"5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881\": container with ID starting with 5a41890ed2aabf93fec75745c3ecc2d4a276d36ea9e9ca9fd4dda3f3f4e69881 not found: ID does not exist" Apr 16 14:20:12.902097 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.902076 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft"] Apr 16 14:20:12.904875 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:12.904853 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a84b0-predictor-57c69c6bf5-2xkft"] Apr 16 14:20:13.748897 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:13.748856 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 14:20:13.749063 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:13.748856 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 14:20:14.517820 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:14.517778 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" path="/var/lib/kubelet/pods/1f4d5611-1b45-4ad6-b88d-9467d30c8490/volumes" Apr 16 14:20:19.872834 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:19.872784 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" 
podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 14:20:19.873262 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:19.872985 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 14:20:23.749885 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:23.749850 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" Apr 16 14:20:23.752256 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:23.750197 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" Apr 16 14:20:29.872869 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:29.872825 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 14:20:29.873240 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:29.872822 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 14:20:39.873435 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:39.873388 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" 
podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 14:20:39.873435 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:39.873415 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 14:20:49.873217 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:49.873176 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 14:20:49.873590 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:49.873176 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 14:20:59.873841 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:59.873806 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" Apr 16 14:20:59.874390 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:20:59.874372 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" Apr 16 14:24:04.541197 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:24:04.541173 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:24:04.544206 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:24:04.544184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:24:04.544325 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:24:04.544189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:24:04.548086 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:24:04.548069 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:28:57.093846 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.093802 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr"] Apr 16 14:28:57.094337 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.094109 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" containerID="cri-o://63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e" gracePeriod=30 Apr 16 14:28:57.151623 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.151589 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx"] Apr 16 14:28:57.151907 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.151862 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" 
podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" containerID="cri-o://472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be" gracePeriod=30 Apr 16 14:28:57.162114 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd"] Apr 16 14:28:57.162621 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162603 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" Apr 16 14:28:57.162706 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162626 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" Apr 16 14:28:57.162706 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162650 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" Apr 16 14:28:57.162706 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162658 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" Apr 16 14:28:57.162825 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162804 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="80f8aa8c-051a-4212-9729-ac903b1beae6" containerName="kserve-container" Apr 16 14:28:57.162825 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.162819 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f4d5611-1b45-4ad6-b88d-9467d30c8490" containerName="kserve-container" Apr 16 14:28:57.166534 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.166511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" Apr 16 14:28:57.177250 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.177232 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" Apr 16 14:28:57.181749 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.181723 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd"] Apr 16 14:28:57.214153 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.214121 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd"] Apr 16 14:28:57.218395 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.218376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" Apr 16 14:28:57.230082 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.230010 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd"] Apr 16 14:28:57.231895 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.231879 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" Apr 16 14:28:57.341719 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.340797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd"] Apr 16 14:28:57.353930 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.353750 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:28:57.399982 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.399953 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd"] Apr 16 14:28:57.402013 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:28:57.401984 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4835e21_50a8_4ba1_be49_b77826a3af09.slice/crio-582e2f1382af869d2c91b11737749c3bfd4fb495f9f9c51785b61537b4b80dff WatchSource:0}: Error finding container 582e2f1382af869d2c91b11737749c3bfd4fb495f9f9c51785b61537b4b80dff: Status 404 returned error can't find the container with id 582e2f1382af869d2c91b11737749c3bfd4fb495f9f9c51785b61537b4b80dff Apr 16 14:28:57.681004 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.680903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" event={"ID":"f4835e21-50a8-4ba1-be49-b77826a3af09","Type":"ContainerStarted","Data":"3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e"} Apr 16 14:28:57.681004 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.680956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" event={"ID":"f4835e21-50a8-4ba1-be49-b77826a3af09","Type":"ContainerStarted","Data":"582e2f1382af869d2c91b11737749c3bfd4fb495f9f9c51785b61537b4b80dff"} Apr 16 
14:28:57.681233 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.681194 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" Apr 16 14:28:57.682681 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.682639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" event={"ID":"80fb4dda-45cb-4076-8b3e-c28fe08b9f2a","Type":"ContainerStarted","Data":"617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2"} Apr 16 14:28:57.682813 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.682687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" event={"ID":"80fb4dda-45cb-4076-8b3e-c28fe08b9f2a","Type":"ContainerStarted","Data":"3847528db1498e8c38544e0ded415fd847cf1172af834fc0c3d357b10bcd6a16"} Apr 16 14:28:57.682813 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.682706 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" Apr 16 14:28:57.682813 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.682790 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:28:57.683637 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.683614 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:28:57.696493 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:28:57.696451 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podStartSLOduration=0.696439329 podStartE2EDuration="696.439329ms" podCreationTimestamp="2026-04-16 14:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:28:57.695355159 +0000 UTC m=+1793.739388346" watchObservedRunningTime="2026-04-16 14:28:57.696439329 +0000 UTC m=+1793.740472515" Apr 16 14:28:57.711091 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:57.711041 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podStartSLOduration=0.71102454 podStartE2EDuration="711.02454ms" podCreationTimestamp="2026-04-16 14:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:28:57.70975845 +0000 UTC m=+1793.753791637" watchObservedRunningTime="2026-04-16 14:28:57.71102454 +0000 UTC m=+1793.755057728" Apr 16 14:28:58.685880 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:58.685836 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:28:58.686262 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:28:58.685836 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:29:00.669497 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.669477 
2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" Apr 16 14:29:00.672609 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.672589 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" Apr 16 14:29:00.693365 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.693300 2573 generic.go:358] "Generic (PLEG): container finished" podID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerID="63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e" exitCode=0 Apr 16 14:29:00.693365 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.693358 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" Apr 16 14:29:00.693529 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.693378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" event={"ID":"b0fd8164-3693-4847-aa40-b634caa56fc9","Type":"ContainerDied","Data":"63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e"} Apr 16 14:29:00.693529 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.693412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr" event={"ID":"b0fd8164-3693-4847-aa40-b634caa56fc9","Type":"ContainerDied","Data":"b20c85bfcb8eb120277109e0c6797b286b04f65aafdd3a9e791fefd8ea2b0005"} Apr 16 14:29:00.693529 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.693432 2573 scope.go:117] "RemoveContainer" containerID="63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e" Apr 16 14:29:00.694781 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.694756 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerID="472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be" exitCode=0 Apr 16 14:29:00.694903 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.694824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" event={"ID":"7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3","Type":"ContainerDied","Data":"472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be"} Apr 16 14:29:00.694903 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.694849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" event={"ID":"7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3","Type":"ContainerDied","Data":"fb23f07519c131b14920f4c43afeefef2bd1bea6e0d18f1211cb9baee71c6954"} Apr 16 14:29:00.695029 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.694907 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx" Apr 16 14:29:00.704576 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.704557 2573 scope.go:117] "RemoveContainer" containerID="63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e" Apr 16 14:29:00.704938 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:29:00.704917 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e\": container with ID starting with 63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e not found: ID does not exist" containerID="63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e" Apr 16 14:29:00.705030 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.704948 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e"} err="failed to get container status \"63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e\": rpc error: code = NotFound desc = could not find container \"63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e\": container with ID starting with 63c0f99e87c4f54acda0f0bab29fb48e283a88f551541fc9d0cc264fdbeea82e not found: ID does not exist" Apr 16 14:29:00.705030 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.704971 2573 scope.go:117] "RemoveContainer" containerID="472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be" Apr 16 14:29:00.713285 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.713266 2573 scope.go:117] "RemoveContainer" containerID="472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be" Apr 16 14:29:00.713514 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:29:00.713496 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be\": container with ID starting with 472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be not found: ID does not exist" containerID="472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be" Apr 16 14:29:00.713575 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.713526 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be"} err="failed to get container status \"472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be\": rpc error: code = NotFound desc = could not find container \"472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be\": container with ID starting with 472fdafe59fe6f03f00076cff280cc7c34a555b3321b111cfff80d73f32452be not found: ID does not exist" Apr 16 14:29:00.721373 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:29:00.721348 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx"] Apr 16 14:29:00.725735 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.725711 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-911ed-predictor-577cc5dbbf-bmwdx"] Apr 16 14:29:00.737497 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.737474 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr"] Apr 16 14:29:00.740126 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:00.740104 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-911ed-predictor-7885d56788-nnsbr"] Apr 16 14:29:02.517949 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:02.517915 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" path="/var/lib/kubelet/pods/7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3/volumes" Apr 16 14:29:02.518356 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:02.518155 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" path="/var/lib/kubelet/pods/b0fd8164-3693-4847-aa40-b634caa56fc9/volumes" Apr 16 14:29:04.565821 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:04.565791 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:29:04.568882 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:04.568861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:29:04.570913 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:04.570892 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:29:04.573864 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:04.573848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:29:08.686836 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:08.686791 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:29:08.687240 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:08.686792 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:29:18.686336 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:18.686289 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:29:18.686748 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:18.686289 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:29:28.686432 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:28.686387 2573 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:29:28.686935 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:28.686387 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:29:32.843189 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.843147 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787"] Apr 16 14:29:32.843642 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.843443 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" containerID="cri-o://60d22cfe0068e4c12ab949e02a80bd26b99ae2c9cf1e8d889f3bb7e0558dff9f" gracePeriod=30 Apr 16 14:29:32.849476 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.849451 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85"] Apr 16 14:29:32.849821 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.849808 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" Apr 16 14:29:32.849870 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.849822 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" Apr 16 14:29:32.849870 ip-10-0-130-195 kubenswrapper[2573]: 
I0416 14:29:32.849835 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" Apr 16 14:29:32.849870 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.849841 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" Apr 16 14:29:32.849964 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.849898 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ed9aa88-fcbc-4dbf-b4d4-6408e11815d3" containerName="kserve-container" Apr 16 14:29:32.849964 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.849910 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0fd8164-3693-4847-aa40-b634caa56fc9" containerName="kserve-container" Apr 16 14:29:32.852918 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.852902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" Apr 16 14:29:32.862699 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.862663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" Apr 16 14:29:32.866110 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.866089 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85"] Apr 16 14:29:32.951693 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.951613 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859"] Apr 16 14:29:32.957097 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.957019 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" Apr 16 14:29:32.962652 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.961498 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859"] Apr 16 14:29:32.980984 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:32.980044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" Apr 16 14:29:33.009249 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.009204 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7"] Apr 16 14:29:33.010607 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.010572 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" containerID="cri-o://2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7" gracePeriod=30 Apr 16 14:29:33.035893 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.035800 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85"] Apr 16 14:29:33.144511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.144461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859"] Apr 16 14:29:33.147075 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:29:33.147046 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c43eb93_71a4_47cc_ba02_fee303dade36.slice/crio-2305d21bfbeef194b132d34b494081abc7b83792c1cd62e3e327f90d06539407 WatchSource:0}: Error finding container 
2305d21bfbeef194b132d34b494081abc7b83792c1cd62e3e327f90d06539407: Status 404 returned error can't find the container with id 2305d21bfbeef194b132d34b494081abc7b83792c1cd62e3e327f90d06539407 Apr 16 14:29:33.824274 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.824238 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" event={"ID":"5c43eb93-71a4-47cc-ba02-fee303dade36","Type":"ContainerStarted","Data":"f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1"} Apr 16 14:29:33.824274 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.824283 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" Apr 16 14:29:33.824573 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.824298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" event={"ID":"5c43eb93-71a4-47cc-ba02-fee303dade36","Type":"ContainerStarted","Data":"2305d21bfbeef194b132d34b494081abc7b83792c1cd62e3e327f90d06539407"} Apr 16 14:29:33.825657 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.825622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" event={"ID":"fd46485d-315e-4279-8374-6ecead3d383f","Type":"ContainerStarted","Data":"10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e"} Apr 16 14:29:33.825657 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.825655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" event={"ID":"fd46485d-315e-4279-8374-6ecead3d383f","Type":"ContainerStarted","Data":"ea9b6a3d5fc799daae1d9f64fd51c070a08ddf3de3d127e1bf91db9e5cf6f87e"} Apr 16 14:29:33.825823 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.825804 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" Apr 16 14:29:33.826072 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.826048 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 14:29:33.826733 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.826712 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:29:33.842346 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.842290 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podStartSLOduration=1.842273938 podStartE2EDuration="1.842273938s" podCreationTimestamp="2026-04-16 14:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:29:33.839182779 +0000 UTC m=+1829.883215966" watchObservedRunningTime="2026-04-16 14:29:33.842273938 +0000 UTC m=+1829.886307125" Apr 16 14:29:33.857905 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:33.857863 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podStartSLOduration=1.8578510719999999 podStartE2EDuration="1.857851072s" podCreationTimestamp="2026-04-16 14:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:29:33.855709178 +0000 
UTC m=+1829.899742364" watchObservedRunningTime="2026-04-16 14:29:33.857851072 +0000 UTC m=+1829.901884257" Apr 16 14:29:34.828815 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:34.828777 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 14:29:34.829025 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:34.828811 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:29:36.839836 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:36.839611 2573 generic.go:358] "Generic (PLEG): container finished" podID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerID="60d22cfe0068e4c12ab949e02a80bd26b99ae2c9cf1e8d889f3bb7e0558dff9f" exitCode=0 Apr 16 14:29:36.839836 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:36.839780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" event={"ID":"c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5","Type":"ContainerDied","Data":"60d22cfe0068e4c12ab949e02a80bd26b99ae2c9cf1e8d889f3bb7e0558dff9f"} Apr 16 14:29:36.913591 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:36.913564 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" Apr 16 14:29:37.258973 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.258948 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" Apr 16 14:29:37.844311 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.844278 2573 generic.go:358] "Generic (PLEG): container finished" podID="87dc1303-63f2-4666-b43e-680609ed55ac" containerID="2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7" exitCode=0 Apr 16 14:29:37.844744 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.844346 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" Apr 16 14:29:37.844744 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.844361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" event={"ID":"87dc1303-63f2-4666-b43e-680609ed55ac","Type":"ContainerDied","Data":"2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7"} Apr 16 14:29:37.844744 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.844398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7" event={"ID":"87dc1303-63f2-4666-b43e-680609ed55ac","Type":"ContainerDied","Data":"55b90eeb9e2565f61af11b794395121c59bdcf077e52a42f8f5faefc6318ca12"} Apr 16 14:29:37.844744 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.844416 2573 scope.go:117] "RemoveContainer" containerID="2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7" Apr 16 14:29:37.845650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.845624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" event={"ID":"c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5","Type":"ContainerDied","Data":"7b82f5298542c20a2df22ec6d880c6e8d5e732525042fcf36e76c19bcef4143d"} Apr 16 14:29:37.845794 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.845693 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787" Apr 16 14:29:37.852838 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.852824 2573 scope.go:117] "RemoveContainer" containerID="2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7" Apr 16 14:29:37.853131 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:29:37.853102 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7\": container with ID starting with 2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7 not found: ID does not exist" containerID="2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7" Apr 16 14:29:37.853227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.853140 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7"} err="failed to get container status \"2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7\": rpc error: code = NotFound desc = could not find container \"2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7\": container with ID starting with 2be4e3f44710db401b2900e0f994586c48d32ea9de90f73254ec2e1b7532b7b7 not found: ID does not exist" Apr 16 14:29:37.853227 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.853164 2573 scope.go:117] "RemoveContainer" containerID="60d22cfe0068e4c12ab949e02a80bd26b99ae2c9cf1e8d889f3bb7e0558dff9f" Apr 16 14:29:37.868547 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.868526 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787"] Apr 16 14:29:37.872230 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.872209 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-009d9-predictor-8d574d484-c7787"] Apr 16 14:29:37.883293 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.883272 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7"] Apr 16 14:29:37.887694 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:37.887659 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-009d9-predictor-5497b97b89-5m5l7"] Apr 16 14:29:38.517301 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:38.517268 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" path="/var/lib/kubelet/pods/87dc1303-63f2-4666-b43e-680609ed55ac/volumes" Apr 16 14:29:38.517525 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:38.517510 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" path="/var/lib/kubelet/pods/c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5/volumes" Apr 16 14:29:38.686543 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:38.686496 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:29:38.686742 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:38.686496 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:29:44.828844 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:44.828805 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:29:44.829375 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:44.828809 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 14:29:48.686851 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:48.686816 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" Apr 16 14:29:48.687225 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:48.687075 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" Apr 16 14:29:54.828810 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:54.828768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:29:54.828810 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:29:54.828792 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 14:30:04.828848 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:04.828805 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:30:04.829305 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:04.828806 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 14:30:14.829180 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:14.829133 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 14:30:14.829695 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:14.829132 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:30:17.415780 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.415747 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd"] Apr 16 14:30:17.416234 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.416101 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" containerID="cri-o://617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2" gracePeriod=30 Apr 16 
14:30:17.426857 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.426831 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n"] Apr 16 14:30:17.427194 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.427183 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" Apr 16 14:30:17.427238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.427196 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" Apr 16 14:30:17.427238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.427220 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" Apr 16 14:30:17.427238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.427225 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" Apr 16 14:30:17.427370 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.427281 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="87dc1303-63f2-4666-b43e-680609ed55ac" containerName="kserve-container" Apr 16 14:30:17.427370 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.427293 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6d0d632-8fc6-49e4-a1b4-b7d48ba4e5b5" containerName="kserve-container" Apr 16 14:30:17.431798 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.431778 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" Apr 16 14:30:17.444137 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.444116 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" Apr 16 14:30:17.445219 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.445200 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n"] Apr 16 14:30:17.521351 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.519471 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x"] Apr 16 14:30:17.529448 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.524582 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" Apr 16 14:30:17.529597 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.529488 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd"] Apr 16 14:30:17.529797 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.529771 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" containerID="cri-o://3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e" gracePeriod=30 Apr 16 14:30:17.531324 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.531304 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x"] Apr 16 14:30:17.547876 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.547852 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" Apr 16 14:30:17.617634 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.617514 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n"] Apr 16 14:30:17.621779 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:30:17.621729 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18d65a8_4884_4ec0_927c_19beae79bca0.slice/crio-51f1255229004aecd47ed8b13339a29afee8da3fcaa5be1fc7ab7e331f50fabb WatchSource:0}: Error finding container 51f1255229004aecd47ed8b13339a29afee8da3fcaa5be1fc7ab7e331f50fabb: Status 404 returned error can't find the container with id 51f1255229004aecd47ed8b13339a29afee8da3fcaa5be1fc7ab7e331f50fabb Apr 16 14:30:17.708299 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.707923 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x"] Apr 16 14:30:17.980509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.980405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" event={"ID":"e8443e75-8eeb-4221-805d-b01c1ec7d740","Type":"ContainerStarted","Data":"2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248"} Apr 16 14:30:17.980509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.980453 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" Apr 16 14:30:17.980509 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.980468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" 
event={"ID":"e8443e75-8eeb-4221-805d-b01c1ec7d740","Type":"ContainerStarted","Data":"d832cdfc0f10047564033befe27ae79a2089dd5a95237810b7b7f18a03f8bdc3"} Apr 16 14:30:17.981869 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.981843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" event={"ID":"d18d65a8-4884-4ec0-927c-19beae79bca0","Type":"ContainerStarted","Data":"3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068"} Apr 16 14:30:17.981869 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.981874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" event={"ID":"d18d65a8-4884-4ec0-927c-19beae79bca0","Type":"ContainerStarted","Data":"51f1255229004aecd47ed8b13339a29afee8da3fcaa5be1fc7ab7e331f50fabb"} Apr 16 14:30:17.982061 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.982050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" Apr 16 14:30:17.982276 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.982246 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 14:30:17.983050 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.983028 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 14:30:17.997263 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:17.997220 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podStartSLOduration=0.997208534 podStartE2EDuration="997.208534ms" podCreationTimestamp="2026-04-16 14:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:17.995708804 +0000 UTC m=+1874.039741991" watchObservedRunningTime="2026-04-16 14:30:17.997208534 +0000 UTC m=+1874.041241720" Apr 16 14:30:18.011258 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:18.011182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podStartSLOduration=1.0111695 podStartE2EDuration="1.0111695s" podCreationTimestamp="2026-04-16 14:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:18.009584143 +0000 UTC m=+1874.053617345" watchObservedRunningTime="2026-04-16 14:30:18.0111695 +0000 UTC m=+1874.055202689" Apr 16 14:30:18.686631 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:18.686590 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 14:30:18.687101 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:18.686584 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 14:30:18.985979 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:18.985886 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 14:30:18.985979 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:18.985885 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 14:30:21.187656 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.187631 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" Apr 16 14:30:21.557660 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.557634 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" Apr 16 14:30:21.996819 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.996784 2573 generic.go:358] "Generic (PLEG): container finished" podID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerID="3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e" exitCode=0 Apr 16 14:30:21.997022 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.996844 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" Apr 16 14:30:21.997022 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.996855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" event={"ID":"f4835e21-50a8-4ba1-be49-b77826a3af09","Type":"ContainerDied","Data":"3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e"} Apr 16 14:30:21.997022 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.996879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd" event={"ID":"f4835e21-50a8-4ba1-be49-b77826a3af09","Type":"ContainerDied","Data":"582e2f1382af869d2c91b11737749c3bfd4fb495f9f9c51785b61537b4b80dff"} Apr 16 14:30:21.997022 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.996893 2573 scope.go:117] "RemoveContainer" containerID="3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e" Apr 16 14:30:21.997990 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.997967 2573 generic.go:358] "Generic (PLEG): container finished" podID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerID="617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2" exitCode=0 Apr 16 14:30:21.998100 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.998018 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" event={"ID":"80fb4dda-45cb-4076-8b3e-c28fe08b9f2a","Type":"ContainerDied","Data":"617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2"} Apr 16 14:30:21.998100 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.998037 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" Apr 16 14:30:21.998100 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:21.998054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd" event={"ID":"80fb4dda-45cb-4076-8b3e-c28fe08b9f2a","Type":"ContainerDied","Data":"3847528db1498e8c38544e0ded415fd847cf1172af834fc0c3d357b10bcd6a16"} Apr 16 14:30:22.006205 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.006185 2573 scope.go:117] "RemoveContainer" containerID="3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e" Apr 16 14:30:22.006461 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:30:22.006424 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e\": container with ID starting with 3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e not found: ID does not exist" containerID="3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e" Apr 16 14:30:22.006461 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.006447 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e"} err="failed to get container status \"3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e\": rpc error: code = NotFound desc = could not find container \"3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e\": container with ID starting with 3d8826a8b3a34283e0c75eebed2b3ff568f1370eff20f245218ace923834810e not found: ID does not exist" Apr 16 14:30:22.006561 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.006469 2573 scope.go:117] "RemoveContainer" containerID="617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2" Apr 16 14:30:22.013845 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:30:22.013828 2573 scope.go:117] "RemoveContainer" containerID="617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2" Apr 16 14:30:22.014074 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:30:22.014056 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2\": container with ID starting with 617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2 not found: ID does not exist" containerID="617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2" Apr 16 14:30:22.014247 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.014084 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2"} err="failed to get container status \"617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2\": rpc error: code = NotFound desc = could not find container \"617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2\": container with ID starting with 617a3a23322a9c706b2ebc82d445e71630e9f225b9586d5873a92de32b3a07f2 not found: ID does not exist" Apr 16 14:30:22.022047 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.022025 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd"] Apr 16 14:30:22.025510 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.025490 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0b236-predictor-5d4f9447b7-gs8zd"] Apr 16 14:30:22.039388 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.039357 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd"] Apr 16 14:30:22.045189 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.045169 2573 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0b236-predictor-66b864b-6dsgd"] Apr 16 14:30:22.516803 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.516769 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" path="/var/lib/kubelet/pods/80fb4dda-45cb-4076-8b3e-c28fe08b9f2a/volumes" Apr 16 14:30:22.517178 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:22.517013 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" path="/var/lib/kubelet/pods/f4835e21-50a8-4ba1-be49-b77826a3af09/volumes" Apr 16 14:30:24.830478 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:24.830443 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" Apr 16 14:30:24.830888 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:24.830598 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" Apr 16 14:30:28.986629 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:28.986582 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 14:30:28.987053 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:28.986582 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 14:30:38.986394 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:38.986346 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 14:30:38.986830 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:38.986346 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 14:30:48.986972 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:48.986928 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 14:30:48.987435 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:48.986926 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 14:30:58.986209 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:58.986146 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 14:30:58.986610 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:30:58.986146 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" 
podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 14:31:08.987853 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:31:08.987821 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" Apr 16 14:31:08.988367 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:31:08.988224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" Apr 16 14:34:04.590083 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:34:04.590049 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:34:04.592993 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:34:04.592973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:34:04.596145 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:34:04.596128 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:34:04.599001 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:34:04.598987 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:39:04.612701 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:04.612575 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:39:04.615729 ip-10-0-130-195 kubenswrapper[2573]: I0416 
14:39:04.615593 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:39:04.620426 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:04.620407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:39:04.623238 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:04.623221 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:39:42.413493 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:42.413412 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n"] Apr 16 14:39:42.413973 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:42.413645 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container" containerID="cri-o://3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068" gracePeriod=30 Apr 16 14:39:42.509864 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:42.509818 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x"] Apr 16 14:39:42.510151 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:42.510108 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container" containerID="cri-o://2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248" gracePeriod=30 Apr 16 14:39:45.773980 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:39:45.773954 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" Apr 16 14:39:45.777175 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.777157 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" Apr 16 14:39:45.940999 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.940962 2573 generic.go:358] "Generic (PLEG): container finished" podID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerID="2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248" exitCode=0 Apr 16 14:39:45.941199 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.941021 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" Apr 16 14:39:45.941199 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.941041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" event={"ID":"e8443e75-8eeb-4221-805d-b01c1ec7d740","Type":"ContainerDied","Data":"2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248"} Apr 16 14:39:45.941199 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.941079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x" event={"ID":"e8443e75-8eeb-4221-805d-b01c1ec7d740","Type":"ContainerDied","Data":"d832cdfc0f10047564033befe27ae79a2089dd5a95237810b7b7f18a03f8bdc3"} Apr 16 14:39:45.941199 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.941096 2573 scope.go:117] "RemoveContainer" containerID="2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248" Apr 16 14:39:45.942200 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.942180 2573 generic.go:358] "Generic (PLEG): container 
finished" podID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerID="3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068" exitCode=0 Apr 16 14:39:45.942278 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.942228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" event={"ID":"d18d65a8-4884-4ec0-927c-19beae79bca0","Type":"ContainerDied","Data":"3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068"} Apr 16 14:39:45.942278 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.942249 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" Apr 16 14:39:45.942278 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.942253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n" event={"ID":"d18d65a8-4884-4ec0-927c-19beae79bca0","Type":"ContainerDied","Data":"51f1255229004aecd47ed8b13339a29afee8da3fcaa5be1fc7ab7e331f50fabb"} Apr 16 14:39:45.950475 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.950459 2573 scope.go:117] "RemoveContainer" containerID="2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248" Apr 16 14:39:45.950772 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:39:45.950749 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248\": container with ID starting with 2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248 not found: ID does not exist" containerID="2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248" Apr 16 14:39:45.950856 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.950783 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248"} err="failed to get container status \"2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248\": rpc error: code = NotFound desc = could not find container \"2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248\": container with ID starting with 2996372115c59ad067acbf07a4f70857b3c0c12f4f90dab0f22edb303a211248 not found: ID does not exist" Apr 16 14:39:45.950856 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.950806 2573 scope.go:117] "RemoveContainer" containerID="3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068" Apr 16 14:39:45.958228 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.958212 2573 scope.go:117] "RemoveContainer" containerID="3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068" Apr 16 14:39:45.958489 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:39:45.958471 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068\": container with ID starting with 3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068 not found: ID does not exist" containerID="3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068" Apr 16 14:39:45.958544 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.958496 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068"} err="failed to get container status \"3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068\": rpc error: code = NotFound desc = could not find container \"3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068\": container with ID starting with 3f2adcb2a0b4a67be4bbc94f697e4ab5e9b342f0fd3792654bb5be633b000068 not found: ID does not exist" Apr 16 14:39:45.962826 ip-10-0-130-195 
kubenswrapper[2573]: I0416 14:39:45.962806 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x"] Apr 16 14:39:45.969490 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.969469 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a940c-predictor-cdc8fbb67-qhs6x"] Apr 16 14:39:45.981362 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.981340 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n"] Apr 16 14:39:45.985826 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:45.985808 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a940c-predictor-55498679d6-p4w6n"] Apr 16 14:39:46.516284 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:46.516241 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" path="/var/lib/kubelet/pods/d18d65a8-4884-4ec0-927c-19beae79bca0/volumes" Apr 16 14:39:46.516596 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:39:46.516576 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" path="/var/lib/kubelet/pods/e8443e75-8eeb-4221-805d-b01c1ec7d740/volumes" Apr 16 14:44:04.635911 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:44:04.635806 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:44:04.638935 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:44:04.638783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:44:04.644789 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:44:04.644770 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log" Apr 16 14:44:04.647519 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:44:04.647503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log" Apr 16 14:47:02.213719 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:02.213627 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859"] Apr 16 14:47:02.214196 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:02.213887 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" containerID="cri-o://f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1" gracePeriod=30 Apr 16 14:47:02.297390 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:02.297352 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85"] Apr 16 14:47:02.297631 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:02.297601 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" containerID="cri-o://10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e" gracePeriod=30 Apr 16 14:47:04.829800 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:04.829749 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: 
connection refused" Apr 16 14:47:04.830168 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:04.829749 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 14:47:05.667218 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:05.667195 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" Apr 16 14:47:05.755747 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:05.755723 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" Apr 16 14:47:06.413410 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.413376 2573 generic.go:358] "Generic (PLEG): container finished" podID="fd46485d-315e-4279-8374-6ecead3d383f" containerID="10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e" exitCode=0 Apr 16 14:47:06.413893 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.413435 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" Apr 16 14:47:06.413893 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.413465 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" event={"ID":"fd46485d-315e-4279-8374-6ecead3d383f","Type":"ContainerDied","Data":"10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e"} Apr 16 14:47:06.413893 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.413507 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85" event={"ID":"fd46485d-315e-4279-8374-6ecead3d383f","Type":"ContainerDied","Data":"ea9b6a3d5fc799daae1d9f64fd51c070a08ddf3de3d127e1bf91db9e5cf6f87e"} Apr 16 14:47:06.413893 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.413530 2573 scope.go:117] "RemoveContainer" containerID="10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e" Apr 16 14:47:06.414615 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.414592 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerID="f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1" exitCode=0 Apr 16 14:47:06.414732 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.414649 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" Apr 16 14:47:06.414732 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.414657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" event={"ID":"5c43eb93-71a4-47cc-ba02-fee303dade36","Type":"ContainerDied","Data":"f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1"} Apr 16 14:47:06.414732 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.414696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859" event={"ID":"5c43eb93-71a4-47cc-ba02-fee303dade36","Type":"ContainerDied","Data":"2305d21bfbeef194b132d34b494081abc7b83792c1cd62e3e327f90d06539407"} Apr 16 14:47:06.422914 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.422892 2573 scope.go:117] "RemoveContainer" containerID="10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e" Apr 16 14:47:06.423346 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:47:06.423317 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e\": container with ID starting with 10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e not found: ID does not exist" containerID="10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e" Apr 16 14:47:06.423454 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.423352 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e"} err="failed to get container status \"10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e\": rpc error: code = NotFound desc = could not find container \"10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e\": container with ID 
starting with 10d2de51efd90a5b1049974ba260071f926e0f8d4927e811cd2dd774f1b6844e not found: ID does not exist"
Apr 16 14:47:06.423454 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.423377 2573 scope.go:117] "RemoveContainer" containerID="f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1"
Apr 16 14:47:06.432581 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.432561 2573 scope.go:117] "RemoveContainer" containerID="f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1"
Apr 16 14:47:06.432875 ip-10-0-130-195 kubenswrapper[2573]: E0416 14:47:06.432849 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1\": container with ID starting with f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1 not found: ID does not exist" containerID="f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1"
Apr 16 14:47:06.432966 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.432884 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1"} err="failed to get container status \"f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1\": rpc error: code = NotFound desc = could not find container \"f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1\": container with ID starting with f1c19324263c12473b23bb1292230f1a0624d43cd61c4dcb2f70986c92f9e8e1 not found: ID does not exist"
Apr 16 14:47:06.436139 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.436117 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85"]
Apr 16 14:47:06.441196 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.441176 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ca33-predictor-866bdfdb6-gds85"]
Apr 16 14:47:06.451892 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.451865 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859"]
Apr 16 14:47:06.454817 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.454796 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ca33-predictor-b9fdb764c-sr859"]
Apr 16 14:47:06.517800 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.517770 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" path="/var/lib/kubelet/pods/5c43eb93-71a4-47cc-ba02-fee303dade36/volumes"
Apr 16 14:47:06.518031 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:06.518019 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd46485d-315e-4279-8374-6ecead3d383f" path="/var/lib/kubelet/pods/fd46485d-315e-4279-8374-6ecead3d383f/volumes"
Apr 16 14:47:30.832857 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:30.832827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-l2l88_0c5b0215-6e48-4512-a0fa-d432021b128c/global-pull-secret-syncer/0.log"
Apr 16 14:47:30.931106 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:30.931076 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j8v85_19493ff7-1060-4a3e-af97-e32119792569/konnectivity-agent/0.log"
Apr 16 14:47:31.053355 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:31.053318 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-195.ec2.internal_6a28db42b1f5e7c92afb50f5a79e5c07/haproxy/0.log"
Apr 16 14:47:34.343507 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:34.343477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-ngv2d_9f9e0f27-a5f3-44ee-9f24-aec06b0aa130/cluster-monitoring-operator/0.log"
Apr 16 14:47:34.633813 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:34.633734 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ns2xt_06256a7c-b1ff-449e-8280-66dc210fe78d/node-exporter/0.log"
Apr 16 14:47:34.657018 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:34.656993 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ns2xt_06256a7c-b1ff-449e-8280-66dc210fe78d/kube-rbac-proxy/0.log"
Apr 16 14:47:34.683650 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:34.683631 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ns2xt_06256a7c-b1ff-449e-8280-66dc210fe78d/init-textfile/0.log"
Apr 16 14:47:36.307121 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:36.307093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-6gtrd_60e08263-d0bc-469a-b7ae-b83c965fa7a3/networking-console-plugin/0.log"
Apr 16 14:47:36.832055 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:36.832030 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/1.log"
Apr 16 14:47:36.837189 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:36.837169 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-h9k6c_0191787a-6511-4245-a250-5fe459bf077c/console-operator/2.log"
Apr 16 14:47:37.245390 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:37.245310 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-5ndkj_69eadc2d-600e-416d-90d1-59a54327ba85/download-server/0.log"
Apr 16 14:47:37.641112 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:37.641084 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-gm66q_943fa4f4-07e5-4de7-9313-95a7a017b304/volume-data-source-validator/0.log"
Apr 16 14:47:38.086914 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.086869 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"]
Apr 16 14:47:38.087213 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087201 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087215 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087226 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087231 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087242 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087248 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087256 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container"
Apr 16 14:47:38.087259 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087261 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087272 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087278 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087287 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087292 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087347 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c43eb93-71a4-47cc-ba02-fee303dade36" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087357 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8443e75-8eeb-4221-805d-b01c1ec7d740" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087364 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd46485d-315e-4279-8374-6ecead3d383f" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087372 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d18d65a8-4884-4ec0-927c-19beae79bca0" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087378 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="80fb4dda-45cb-4076-8b3e-c28fe08b9f2a" containerName="kserve-container"
Apr 16 14:47:38.087511 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.087384 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4835e21-50a8-4ba1-be49-b77826a3af09" containerName="kserve-container"
Apr 16 14:47:38.090630 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.090609 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.093168 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.093144 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wdzv9\"/\"openshift-service-ca.crt\""
Apr 16 14:47:38.094318 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.094299 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wdzv9\"/\"kube-root-ca.crt\""
Apr 16 14:47:38.094420 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.094316 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wdzv9\"/\"default-dockercfg-hq28d\""
Apr 16 14:47:38.099460 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.099437 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"]
Apr 16 14:47:38.191463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.191420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-lib-modules\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.191463 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.191470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkw7\" (UniqueName: \"kubernetes.io/projected/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-kube-api-access-8qkw7\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.191717 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.191497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-proc\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.191717 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.191574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-sys\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.191717 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.191628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-podres\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.281757 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.281732 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8ptvc_2e37e261-0537-4203-b8fd-2b1189bee139/dns/0.log"
Apr 16 14:47:38.292093 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-podres\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292185 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-lib-modules\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292185 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkw7\" (UniqueName: \"kubernetes.io/projected/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-kube-api-access-8qkw7\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-proc\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-sys\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-podres\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-sys\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-lib-modules\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.292374 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.292344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-proc\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.300692 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.300658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkw7\" (UniqueName: \"kubernetes.io/projected/2be321ac-29a0-48c7-8305-17f5a6b6b6c3-kube-api-access-8qkw7\") pod \"perf-node-gather-daemonset-8ktkx\" (UID: \"2be321ac-29a0-48c7-8305-17f5a6b6b6c3\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.301760 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.301729 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8ptvc_2e37e261-0537-4203-b8fd-2b1189bee139/kube-rbac-proxy/0.log"
Apr 16 14:47:38.401641 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.401536 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:38.426714 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.426688 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rh5gj_ee74f40d-8d8e-43f7-925f-88a0b6fe4a5d/dns-node-resolver/0.log"
Apr 16 14:47:38.521581 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.521554 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"]
Apr 16 14:47:38.523644 ip-10-0-130-195 kubenswrapper[2573]: W0416 14:47:38.523614 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2be321ac_29a0_48c7_8305_17f5a6b6b6c3.slice/crio-18214963d0753262670d2403179c85b26c3932fae5b9b37f15a28051555a13f6 WatchSource:0}: Error finding container 18214963d0753262670d2403179c85b26c3932fae5b9b37f15a28051555a13f6: Status 404 returned error can't find the container with id 18214963d0753262670d2403179c85b26c3932fae5b9b37f15a28051555a13f6
Apr 16 14:47:38.525211 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.525196 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:47:38.914246 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:38.914204 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kh2z9_f4089fed-dbf0-4e49-a815-79b762aba862/node-ca/0.log"
Apr 16 14:47:39.526200 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:39.526163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx" event={"ID":"2be321ac-29a0-48c7-8305-17f5a6b6b6c3","Type":"ContainerStarted","Data":"859a8c3d5291a409031e18f9da7c5d4ccea72f7e164c5d4cb2368476453984f0"}
Apr 16 14:47:39.526200 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:39.526197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx" event={"ID":"2be321ac-29a0-48c7-8305-17f5a6b6b6c3","Type":"ContainerStarted","Data":"18214963d0753262670d2403179c85b26c3932fae5b9b37f15a28051555a13f6"}
Apr 16 14:47:39.526425 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:39.526275 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:39.542314 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:39.542271 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx" podStartSLOduration=1.542256329 podStartE2EDuration="1.542256329s" podCreationTimestamp="2026-04-16 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:47:39.541804742 +0000 UTC m=+2915.585837928" watchObservedRunningTime="2026-04-16 14:47:39.542256329 +0000 UTC m=+2915.586289516"
Apr 16 14:47:39.620084 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:39.620061 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-56b5bd7874-fvchd_fc51c8a5-99bf-42f4-9371-6d41a4f0fc91/router/0.log"
Apr 16 14:47:40.015406 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:40.015378 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kqzcz_a8fbe668-dcf2-4f5d-97a8-f28c95ce8261/serve-healthcheck-canary/0.log"
Apr 16 14:47:40.349769 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:40.349731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-c2xv7_26ed3c5b-e875-42d2-b496-1c3aacfc5b95/insights-operator/0.log"
Apr 16 14:47:40.350637 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:40.350613 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-c2xv7_26ed3c5b-e875-42d2-b496-1c3aacfc5b95/insights-operator/1.log"
Apr 16 14:47:40.371482 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:40.371461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mgk9_9e7cd354-7d8c-4218-9aa9-27bebdc77ec8/kube-rbac-proxy/0.log"
Apr 16 14:47:40.391063 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:40.391043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mgk9_9e7cd354-7d8c-4218-9aa9-27bebdc77ec8/exporter/0.log"
Apr 16 14:47:40.412368 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:40.412344 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mgk9_9e7cd354-7d8c-4218-9aa9-27bebdc77ec8/extractor/0.log"
Apr 16 14:47:42.557605 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:42.557578 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-qhjp4_50ade766-cee1-48d9-b302-dfc852499e70/manager/0.log"
Apr 16 14:47:42.580888 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:42.580856 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-w59zs_9bab7228-870d-492d-be90-dc4a62c8b740/server/0.log"
Apr 16 14:47:42.810708 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:42.810595 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-b8k5j_f18b4eb6-0704-4195-9dd1-fb662aa05b1e/manager/0.log"
Apr 16 14:47:45.538425 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:45.538395 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-8ktkx"
Apr 16 14:47:46.284107 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:46.284077 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7twj2_2513a90f-fe9e-479f-988c-ff9994dfbf16/migrator/0.log"
Apr 16 14:47:46.303653 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:46.303616 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7twj2_2513a90f-fe9e-479f-988c-ff9994dfbf16/graceful-termination/0.log"
Apr 16 14:47:46.638473 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:46.638445 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-845gn_027223a7-23fc-4750-bd60-a91d4fbb3300/kube-storage-version-migrator-operator/1.log"
Apr 16 14:47:46.639599 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:46.639578 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-845gn_027223a7-23fc-4750-bd60-a91d4fbb3300/kube-storage-version-migrator-operator/0.log"
Apr 16 14:47:47.932651 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:47.932620 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:47:47.954833 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:47.954807 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/egress-router-binary-copy/0.log"
Apr 16 14:47:47.979233 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:47.979207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/cni-plugins/0.log"
Apr 16 14:47:48.002449 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.002420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/bond-cni-plugin/0.log"
Apr 16 14:47:48.021660 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.021638 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/routeoverride-cni/0.log"
Apr 16 14:47:48.042255 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.042236 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/whereabouts-cni-bincopy/0.log"
Apr 16 14:47:48.063607 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.063586 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-twv7g_35776b6a-fb9e-462b-9392-ed0451ab2515/whereabouts-cni/0.log"
Apr 16 14:47:48.150729 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.150699 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xt4tf_ca354899-9eed-44c4-8f19-33e699024e89/kube-multus/0.log"
Apr 16 14:47:48.209397 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.209319 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-59j5c_45ca9ed1-9528-4529-8ffc-64027bd9e40a/network-metrics-daemon/0.log"
Apr 16 14:47:48.229604 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:48.229577 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-59j5c_45ca9ed1-9528-4529-8ffc-64027bd9e40a/kube-rbac-proxy/0.log"
Apr 16 14:47:49.660068 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.659988 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-controller/0.log"
Apr 16 14:47:49.676109 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.676089 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/0.log"
Apr 16 14:47:49.688107 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.688085 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovn-acl-logging/1.log"
Apr 16 14:47:49.705548 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.705529 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/kube-rbac-proxy-node/0.log"
Apr 16 14:47:49.725287 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.725268 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:47:49.744694 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.744658 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/northd/0.log"
Apr 16 14:47:49.763338 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.763318 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/nbdb/0.log"
Apr 16 14:47:49.790448 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.790423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/sbdb/0.log"
Apr 16 14:47:49.892237 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:49.892210 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kftpl_a3c7126c-784b-47c6-88ca-c5375d70b493/ovnkube-controller/0.log"
Apr 16 14:47:50.835723 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:50.835693 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-55pdr_51685b01-c9b4-47f8-98e7-5e19c0d32502/check-endpoints/0.log"
Apr 16 14:47:50.880783 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:50.880749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cb8c5_c63a949a-1dfc-4b9a-a5f7-ab03e43113fb/network-check-target-container/0.log"
Apr 16 14:47:51.820946 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:51.820916 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zbxjq_9473d2b0-bd27-437c-8163-34180f007c16/iptables-alerter/0.log"
Apr 16 14:47:52.445582 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:52.445559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-r275w_47298b75-037b-4a3e-87aa-d3244b5c60f9/tuned/0.log"
Apr 16 14:47:54.162865 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:54.162831 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-62hlh_66aa1d26-c320-4759-bcd7-99678d388133/cluster-samples-operator/0.log"
Apr 16 14:47:54.178297 ip-10-0-130-195 kubenswrapper[2573]: I0416 14:47:54.178276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-62hlh_66aa1d26-c320-4759-bcd7-99678d388133/cluster-samples-operator-watch/0.log"