Apr 16 16:00:23.848932 ip-10-0-130-130 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:00:23.848947 ip-10-0-130-130 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:00:23.848957 ip-10-0-130-130 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:00:23.849290 ip-10-0-130-130 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:00:35.111278 ip-10-0-130-130 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:00:35.111302 ip-10-0-130-130 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 255d3c89f0034f0c9ea190ce57f057b7 --
Apr 16 16:03:02.364845 ip-10-0-130-130 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:03:02.830640 ip-10-0-130-130 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:03:02.830640 ip-10-0-130-130 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:03:02.830640 ip-10-0-130-130 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:03:02.830640 ip-10-0-130-130 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:03:02.830640 ip-10-0-130-130 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
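The five deprecation warnings above all point at the same remedy: move those command-line flags into the file passed to --config (shown later in this log as /etc/kubernetes/kubelet.conf). As a rough, illustrative sketch only (on OpenShift this file is rendered by the Machine Config Operator and should not be hand-edited), the equivalent KubeletConfiguration stanzas could look like the following. The values are copied from the FLAG dump further down in this log where available; the unix:// scheme on the runtime endpoint and the eviction threshold are assumptions, not values taken from this log.

kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
# replaces --container-runtime-endpoint=/var/run/crio/crio.sock
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings;
# the threshold below is a placeholder, not a value from this log
evictionHard:
  memory.available: 100Mi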
Apr 16 16:03:02.834329 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.834244 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:03:02.836536 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836521 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:03:02.836536 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836536 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836540 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836543 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836546 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836549 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836552 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836555 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836558 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836560 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836588 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836593 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836596 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836599 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:03:02.836597 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836602 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836606 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836609 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836612 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836617 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836621 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836624 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836627 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836630 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836634 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836637 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836640 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836643 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836646 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836648 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836652 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836656 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836659 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836662 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:02.836910 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836664 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836667 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836669 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836672 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836675 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836677 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836680 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836682 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836685 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836687 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836690 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836692 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836695 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836697 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836700 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836704 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836707 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836710 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836712 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:02.837389 
ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836715 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:02.837389 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836717 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836720 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836723 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836726 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836728 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836731 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836733 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836736 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836738 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836740 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836743 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836745 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836748 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836750 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836752 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836755 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836758 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836760 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836762 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836765 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:02.837888 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836767 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836770 2577 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836773 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836775 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836778 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836780 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836783 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836785 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836788 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836790 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836793 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836796 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.836798 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838246 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838253 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838257 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838260 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838265 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838270 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838274 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:02.838363 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838279 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838283 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838287 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838294 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:02.838855 ip-10-0-130-130 
kubenswrapper[2577]: W0416 16:03:02.838299 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838304 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838307 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838310 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838313 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838315 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838318 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838323 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838326 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838328 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838331 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838333 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838336 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838339 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838341 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838344 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:02.838855 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838347 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838349 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838352 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838354 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838357 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838360 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838362 2577 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838366 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838370 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838377 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838382 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838386 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838389 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838391 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838394 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838396 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838400 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838403 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838406 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838409 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:02.839352 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838412 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838414 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838417 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838419 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838422 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838425 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838427 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838430 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838433 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy 
Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838435 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838438 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838440 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838443 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838446 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838453 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838457 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838464 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838470 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838473 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:02.839900 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838476 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838479 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838482 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838485 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838487 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838490 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838492 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838495 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838497 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838500 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838517 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838520 2577 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838523 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838526 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838529 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838532 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838536 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838540 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838545 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.838549 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:02.840415 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838632 2577 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838641 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838649 2577 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838653 2577 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838657 2577 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838660 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838665 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838669 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838673 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838676 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838679 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838682 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838685 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838689 2577 flags.go:64] FLAG: --cgroup-root="" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838692 2577 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838695 2577 flags.go:64] FLAG: 
--client-ca-file="" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838700 2577 flags.go:64] FLAG: --cloud-config="" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838705 2577 flags.go:64] FLAG: --cloud-provider="external" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838709 2577 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838716 2577 flags.go:64] FLAG: --cluster-domain="" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838719 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838722 2577 flags.go:64] FLAG: --config-dir="" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838724 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838729 2577 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 16:03:02.840933 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838734 2577 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838737 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838740 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838743 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838746 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838749 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838752 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838755 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838758 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838762 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838765 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838768 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838771 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838774 2577 flags.go:64] FLAG: --enable-server="true" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838777 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838785 2577 flags.go:64] FLAG: --event-burst="100" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838791 2577 flags.go:64] FLAG: --event-qps="50" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838796 2577 flags.go:64] FLAG: 
--event-storage-age-limit="default=0" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838800 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838803 2577 flags.go:64] FLAG: --eviction-hard="" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838807 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838810 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838812 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838816 2577 flags.go:64] FLAG: --eviction-soft="" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838818 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 16:03:02.841531 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838821 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838824 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838827 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838830 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838833 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838835 2577 flags.go:64] FLAG: --feature-gates="" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838839 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838843 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838846 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838849 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838852 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838855 2577 flags.go:64] FLAG: --help="false" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838859 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-130-130.ec2.internal" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838865 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838870 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838875 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838881 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838885 2577 
flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838888 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838891 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838894 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838897 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838900 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838903 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:03:02.842133 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838906 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838910 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838912 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838915 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838918 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838921 2577 flags.go:64] FLAG: --lock-file="" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838923 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838926 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838929 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838935 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838938 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838941 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838946 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838952 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838958 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838962 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838966 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838970 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838973 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838990 
2577 flags.go:64] FLAG: --max-pods="110" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838993 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838995 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.838998 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839001 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839004 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:03:02.842731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839007 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839010 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839018 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839021 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839025 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839030 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839035 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839044 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839047 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839050 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839053 2577 flags.go:64] FLAG: --port="10250" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839056 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839059 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0450b611951f844b4" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839062 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839065 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839068 2577 flags.go:64] FLAG: --register-node="true" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839071 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839073 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839077 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839080 2577 flags.go:64] FLAG: --registry-qps="5" 
Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839082 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839085 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839088 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839091 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839095 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:03:02.843334 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839098 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839103 2577 flags.go:64] FLAG: --runonce="false" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839106 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839111 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839116 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839121 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839126 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839130 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839133 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839136 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839139 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839141 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839145 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839148 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839151 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839154 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839157 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839163 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839166 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839168 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839173 2577 flags.go:64] FLAG: 
--tls-min-version="" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839176 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839179 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839182 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839185 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:03:02.843959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839189 2577 flags.go:64] FLAG: --v="2" Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839196 2577 flags.go:64] FLAG: --version="false" Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839202 2577 flags.go:64] FLAG: --vmodule="" Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839209 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839212 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839327 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839332 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839336 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839340 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839343 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839346 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839348 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839351 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839354 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839359 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839364 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839368 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839372 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839376 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839381 
2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:02.844571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839383 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839386 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839389 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839391 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839394 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839397 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839399 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839403 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839407 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839410 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839413 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839416 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839419 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839421 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839424 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839427 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839429 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839432 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839435 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:02.845087 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839437 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839443 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839447 2577 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839451 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839456 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839460 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839466 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839471 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839473 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839476 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839479 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839482 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839484 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839487 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839490 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839492 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839495 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839497 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839500 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:02.845630 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839516 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839519 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839522 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839525 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839529 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839533 2577 feature_gate.go:328] unrecognized 
feature gate: VolumeGroupSnapshot Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839538 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839542 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839546 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839549 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839551 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839554 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839556 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839559 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839563 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839569 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839572 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839575 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839577 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839580 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:02.846089 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839585 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839588 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839591 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839593 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839596 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839598 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839601 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839604 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 
16:03:02.839606 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839609 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839614 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839619 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.839623 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.839633 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.846304 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.846320 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:03:02.846591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846375 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846381 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846384 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846387 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846390 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846393 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846397 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846400 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846403 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846405 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846408 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846411 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 
16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846413 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846416 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846418 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846421 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846424 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846426 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846429 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846433 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:02.846995 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846435 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846438 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846440 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846443 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846445 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846448 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846450 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846453 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846456 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846458 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846461 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846465 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846468 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846470 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 
16:03:02.846474 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846476 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846481 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846486 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846489 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:02.847489 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846492 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846494 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846497 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846500 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846521 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846524 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846527 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846530 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846532 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846535 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846538 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846540 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846543 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846546 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846548 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846551 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846553 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846556 2577 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846558 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846561 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:02.847982 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846563 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846565 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846568 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846570 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846574 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846578 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846580 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846583 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846586 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846589 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846591 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846594 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846597 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846600 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846604 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846608 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846610 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846613 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846615 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846618 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:02.848554 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846620 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846623 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846625 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846628 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846630 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846633 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846635 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.846641 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846743 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846748 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846751 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846754 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846757 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846760 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:02.849069 
ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846762 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:02.849069 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846765 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846768 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846771 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846775 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846778 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846781 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846783 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846786 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846788 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846791 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846793 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846796 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846798 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846801 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846803 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846806 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846809 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846811 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846814 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846817 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:02.849430 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846819 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846822 2577 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846825 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846827 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846830 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846832 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846835 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846838 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846841 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846844 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846847 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846849 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846852 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846854 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846857 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846860 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846863 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846866 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846868 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:02.849949 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846870 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846873 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846876 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846878 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846880 2577 feature_gate.go:328] unrecognized 
feature gate: VSphereHostVMGroupZonal Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846883 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846885 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846888 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846891 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846895 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846898 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846900 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846903 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846906 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846909 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846911 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846914 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846916 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846919 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846921 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:02.850409 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846924 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846926 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846929 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846931 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846934 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846936 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846938 2577 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846941 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846944 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846946 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846949 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846952 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846954 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846956 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846959 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846961 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846963 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846966 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846968 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:02.850924 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:02.846971 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:02.851395 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.846975 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:03:02.851395 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.847849 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:03:02.851395 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.850556 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:03:02.851488 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.851402 2577 server.go:1019] "Starting client certificate rotation" Apr 16 16:03:02.851534 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.851518 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:03:02.851564 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.851559 2577 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:03:02.879494 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.879472 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:03:02.885289 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.885253 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:03:02.899843 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.899819 2577 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:03:02.904898 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.904881 2577 log.go:25] "Validated CRI v1 image API" Apr 16 16:03:02.906167 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.906148 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:03:02.908684 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.908663 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 af7e0352-e82a-4361-aeac-fcd5818a8f47:/dev/nvme0n1p3 eff7c1c5-5d3c-4a3f-abb2-46db880700e5:/dev/nvme0n1p4] Apr 16 16:03:02.908745 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.908683 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:03:02.914814 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.914793 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:03:02.915070 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.914957 2577 manager.go:217] Machine: {Timestamp:2026-04-16 16:03:02.912908547 +0000 UTC m=+0.427946635 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099666 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27157daefaf7ee5595f9b489990fdb SystemUUID:ec27157d-aefa-f7ee-5595-f9b489990fdb BootID:255d3c89-f003-4f0c-9ea1-90ce57f057b7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:37:bc:94:d8:1f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:37:bc:94:d8:1f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:42:84:fc:0d:81 Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:03:02.915644 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.915634 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 16:03:02.915738 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.915726 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 16:03:02.916726 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.916700 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 16:03:02.916872 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.916729 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-130.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 16:03:02.916925 ip-10-0-130-130 
kubenswrapper[2577]: I0416 16:03:02.916882 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 16:03:02.916925 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.916890 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 16:03:02.916925 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.916907 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:03:02.916925 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.916920 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:03:02.918462 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.918450 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:03:02.918590 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.918581 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:03:02.920936 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.920926 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:03:02.920971 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.920940 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:03:02.920971 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.920952 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:03:02.920971 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.920963 2577 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:03:02.920971 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.920971 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:03:02.922072 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.922060 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:03:02.922120 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.922079 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:03:02.925875 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.925859 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:03:02.927783 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.927769 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:03:02.929202 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929185 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:03:02.929202 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929205 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929213 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929220 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929233 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929239 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:03:02.929314 
ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929245 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929250 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929258 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929265 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929274 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:03:02.929314 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.929282 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:03:02.930141 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.930126 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:03:02.930141 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.930141 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:03:02.934300 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.934284 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:03:02.934383 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.934337 2577 server.go:1295] "Started kubelet" Apr 16 16:03:02.934434 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.934409 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:03:02.934559 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.934489 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:03:02.934612 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.934586 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:03:02.935097 ip-10-0-130-130 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 16:03:02.935665 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.935640 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:03:02.936644 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.936632 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:03:02.937730 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.937712 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-130.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:03:02.937875 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.937855 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:03:02.937944 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.937922 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:03:02.941202 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.941183 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:03:02.941694 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.941676 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:03:02.942387 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942367 2577 factory.go:55] Registering systemd factory Apr 16 16:03:02.942387 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942389 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:03:02.942616 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942597 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:03:02.942616 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942618 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:03:02.942748 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942600 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:03:02.942748 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942693 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:03:02.942748 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.942701 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:03:02.943205 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.943184 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:02.945904 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.945885 2577 factory.go:153] Registering CRI-O factory Apr 16 16:03:02.945904 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.945905 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 16:03:02.946046 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.945980 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: 
cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:03:02.946046 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.946005 2577 factory.go:103] Registering Raw factory Apr 16 16:03:02.946046 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.946019 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 16:03:02.946171 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.946056 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:03:02.946217 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.946193 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:03:02.946429 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.946417 2577 manager.go:319] Starting recovery of all containers Apr 16 16:03:02.946946 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.946928 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:03:02.950558 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.946111 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-130.ec2.internal.18a6e1cf2aa6dc1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-130.ec2.internal,UID:ip-10-0-130-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-130.ec2.internal,},FirstTimestamp:2026-04-16 16:03:02.934297626 +0000 UTC m=+0.449335714,LastTimestamp:2026-04-16 16:03:02.934297626 +0000 UTC m=+0.449335714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-130.ec2.internal,}" Apr 16 16:03:02.955438 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.955416 2577 manager.go:324] Recovery completed Apr 16 16:03:02.956742 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.956710 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 16:03:02.959430 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.959418 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:03:02.962055 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.962030 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:03:02.962305 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.962252 2577 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-130-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:03:02.962398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.962320 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:03:02.963154 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.963140 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:03:02.963255 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.963238 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:03:02.963364 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.963265 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:03:02.964869 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.964802 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-130.ec2.internal.18a6e1cf2c4e4623 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-130.ec2.internal,UID:ip-10-0-130-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-130.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-130.ec2.internal,},FirstTimestamp:2026-04-16 16:03:02.962046499 +0000 UTC m=+0.477084585,LastTimestamp:2026-04-16 16:03:02.962046499 +0000 UTC m=+0.477084585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-130.ec2.internal,}" Apr 16 16:03:02.965724 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.965707 2577 policy_none.go:49] "None policy: Start" Apr 16 16:03:02.965797 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.965737 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:03:02.966410 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.966399 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:03:02.973567 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:02.973483 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-130.ec2.internal.18a6e1cf2c523476 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-130.ec2.internal,UID:ip-10-0-130-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-130.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-130.ec2.internal,},FirstTimestamp:2026-04-16 16:03:02.962304118 +0000 UTC m=+0.477342218,LastTimestamp:2026-04-16 16:03:02.962304118 +0000 UTC m=+0.477342218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-130.ec2.internal,}" Apr 16 16:03:02.980100 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.980079 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rt5nr" Apr 16 16:03:02.982108 ip-10-0-130-130 
kubenswrapper[2577]: E0416 16:03:02.982044 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-130.ec2.internal.18a6e1cf2c528e7a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-130.ec2.internal,UID:ip-10-0-130-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-130.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-130.ec2.internal,},FirstTimestamp:2026-04-16 16:03:02.962327162 +0000 UTC m=+0.477365250,LastTimestamp:2026-04-16 16:03:02.962327162 +0000 UTC m=+0.477365250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-130.ec2.internal,}" Apr 16 16:03:02.988559 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:02.988540 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rt5nr" Apr 16 16:03:03.001047 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001032 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 16:03:03.001120 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.001066 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:03:03.001120 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001081 2577 server.go:85] "Starting device plugin registration server" Apr 16 16:03:03.001348 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001336 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:03:03.001416 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001350 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:03:03.001518 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001489 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:03:03.001704 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001576 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:03:03.001704 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.001584 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:03:03.002231 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.002207 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:03:03.002324 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.002258 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.089236 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.089160 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:03:03.090471 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.090457 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 16:03:03.090556 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.090492 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:03:03.090556 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.090525 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 16:03:03.090556 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.090533 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:03:03.090681 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.090566 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:03:03.094742 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.094718 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:03.102245 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.102231 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:03:03.103300 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.103284 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:03:03.103366 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.103317 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:03:03.103366 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.103327 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:03:03.103366 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.103353 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.111603 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.111589 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.111650 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.111613 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-130.ec2.internal\": node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.129616 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.129593 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.190959 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.190932 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal"] Apr 16 16:03:03.191051 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.191019 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:03:03.191990 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.191975 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:03:03.192058 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.192006 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:03:03.192058 ip-10-0-130-130 kubenswrapper[2577]: 
I0416 16:03:03.192018 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:03:03.193483 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.193471 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:03:03.193655 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.193641 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.193699 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.193674 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:03:03.194212 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.194195 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:03:03.194275 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.194225 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:03:03.194275 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.194241 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:03:03.194275 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.194250 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:03:03.194275 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.194276 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:03:03.194405 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.194286 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:03:03.196064 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.196051 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.196106 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.196075 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:03:03.197010 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.196996 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:03:03.197083 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.197023 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:03:03.197083 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.197033 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:03:03.217078 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.217059 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-130.ec2.internal\" not found" node="ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.220949 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.220935 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-130.ec2.internal\" not found" node="ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.229979 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.229954 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.244804 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.244782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ac49a4c0565840b23e57dfca6108f86-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal\" (UID: \"2ac49a4c0565840b23e57dfca6108f86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.244883 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.244809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4a6fddb6306d2ef465913be401643923-config\") pod \"kube-apiserver-proxy-ip-10-0-130-130.ec2.internal\" (UID: \"4a6fddb6306d2ef465913be401643923\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.244883 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.244829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ac49a4c0565840b23e57dfca6108f86-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal\" (UID: \"2ac49a4c0565840b23e57dfca6108f86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.330502 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.330478 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.346004 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.345948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/2ac49a4c0565840b23e57dfca6108f86-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal\" (UID: \"2ac49a4c0565840b23e57dfca6108f86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.346004 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.345964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ac49a4c0565840b23e57dfca6108f86-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal\" (UID: \"2ac49a4c0565840b23e57dfca6108f86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.346117 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.346015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ac49a4c0565840b23e57dfca6108f86-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal\" (UID: \"2ac49a4c0565840b23e57dfca6108f86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.346117 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.346033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4a6fddb6306d2ef465913be401643923-config\") pod \"kube-apiserver-proxy-ip-10-0-130-130.ec2.internal\" (UID: \"4a6fddb6306d2ef465913be401643923\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.346117 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.346059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4a6fddb6306d2ef465913be401643923-config\") pod \"kube-apiserver-proxy-ip-10-0-130-130.ec2.internal\" (UID: \"4a6fddb6306d2ef465913be401643923\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.346117 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.346088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ac49a4c0565840b23e57dfca6108f86-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal\" (UID: \"2ac49a4c0565840b23e57dfca6108f86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.430563 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.430535 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.521065 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.521040 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.522976 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.522957 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" Apr 16 16:03:03.531525 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.531493 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.632096 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.632012 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.732620 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.732589 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.833119 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.833087 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.851642 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.851617 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 16:03:03.851759 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.851744 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:03:03.933205 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:03.933170 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:03.941577 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.941560 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:03:03.954582 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.954561 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:03:03.976156 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.976131 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lsdpp" Apr 16 16:03:03.985964 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.985940 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lsdpp" Apr 16 16:03:03.991129 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.991104 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:58:02 +0000 UTC" deadline="2027-09-12 08:16:55.943218332 +0000 UTC" Apr 16 16:03:03.991129 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:03.991129 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12328h13m51.952092317s" Apr 16 16:03:04.033371 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.033339 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:04.111426 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:04.111388 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a6fddb6306d2ef465913be401643923.slice/crio-89df03ac85a3aa774103320b25785e49df4a6fdbea3b3b0f85fab06b4baf432c WatchSource:0}: Error finding container 89df03ac85a3aa774103320b25785e49df4a6fdbea3b3b0f85fab06b4baf432c: Status 404 returned error can't find the container with id 89df03ac85a3aa774103320b25785e49df4a6fdbea3b3b0f85fab06b4baf432c Apr 16 16:03:04.111688 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:04.111655 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac49a4c0565840b23e57dfca6108f86.slice/crio-e3ae66b3fb1589feb579453756c78306c11b4437474ea71535e2f777bc5fb170 WatchSource:0}: Error finding container e3ae66b3fb1589feb579453756c78306c11b4437474ea71535e2f777bc5fb170: Status 404 returned error can't find the container with id e3ae66b3fb1589feb579453756c78306c11b4437474ea71535e2f777bc5fb170 Apr 16 16:03:04.117057 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.117038 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:03:04.134248 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.134223 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:04.187134 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.187061 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:04.234810 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.234784 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:04.335400 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.335372 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:04.357204 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.357183 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:04.436327 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.436297 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-130.ec2.internal\" not found" Apr 16 16:03:04.535371 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.535297 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:04.542085 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.542062 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" Apr 16 16:03:04.556446 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.556403 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:03:04.557354 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.557331 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" Apr 16 16:03:04.566593 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.566570 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 
16 16:03:04.922580 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.922544 2577 apiserver.go:52] "Watching apiserver" Apr 16 16:03:04.930313 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.930287 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:03:04.931072 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.931046 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-blk6n","openshift-multus/network-metrics-daemon-2mqsw","openshift-network-diagnostics/network-check-target-5n846","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5","openshift-image-registry/node-ca-7mgkb","openshift-multus/multus-vshnw","openshift-network-operator/iptables-alerter-jf4lx","openshift-ovn-kubernetes/ovnkube-node-h6xk8","kube-system/konnectivity-agent-mpcnj","kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal","openshift-cluster-node-tuning-operator/tuned-6q4mf","openshift-dns/node-resolver-chvcc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal"] Apr 16 16:03:04.934893 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.934869 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:04.935061 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.934965 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:04.936182 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.936162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:04.936280 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:04.936219 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:04.937449 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.937421 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:04.939109 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.938677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.940363 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.939882 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.941321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.940816 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:03:04.941321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.940965 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-p2x9m\"" Apr 16 16:03:04.941321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.940990 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.941321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.940818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.941321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.941127 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.941321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.941232 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.941667 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.941414 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:03:04.941830 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.941810 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lpsvq\"" Apr 16 16:03:04.941922 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.941870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:03:04.942181 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.942161 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:03:04.942471 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.942450 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hfdxs\"" Apr 16 16:03:04.942628 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.942610 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.942737 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.942722 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.942821 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.942803 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:04.944273 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.944254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.944599 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.944387 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.944836 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.944816 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:03:04.945060 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.945043 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.945186 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.945169 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fkwk5\"" Apr 16 16:03:04.945249 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.945217 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.946015 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.945998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:04.946585 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.946567 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:03:04.946694 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.946604 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:03:04.946899 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.946882 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:03:04.946973 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.946940 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.947109 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947093 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:03:04.947248 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947100 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.947333 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947320 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gqfjh\"" Apr 16 16:03:04.947459 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947445 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-94srw\"" Apr 16 16:03:04.947602 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947582 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:03:04.947781 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:03:04.948006 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.947988 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:04.948109 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.948095 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:03:04.948364 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.948345 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cm6fv\"" Apr 16 16:03:04.948638 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.948611 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:03:04.949580 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.949561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:04.950022 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.950004 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kd9ss\"" Apr 16 16:03:04.950360 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.950291 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.950454 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.950407 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.951899 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.951880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:03:04.952035 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.952014 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:03:04.952103 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.952090 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-279dv\"" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-cni-bin\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-os-release\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-cni-multus\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955565 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-systemd\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-systemd-units\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-etc-selinux\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-cni-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-socket-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-hostroot\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-cni-netd\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.955767 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovnkube-script-lib\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-k8s-cni-cncf-io\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgd2z\" (UniqueName: \"kubernetes.io/projected/5e6c7798-3e64-45d4-88fa-1f044dd7030c-kube-api-access-qgd2z\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-var-lib-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955869 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-cnibin\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955892 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-daemon-config\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955917 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cnibin\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-slash\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-ovn\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 
16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.955994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-run-ovn-kubernetes\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovnkube-config\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-env-overrides\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-sys-fs\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-kubelet\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-multus-certs\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-system-cni-dir\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.956445 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cni-binary-copy\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956202 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/28ab87e7-a174-44d1-a00c-16f49134a9b5-serviceca\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9b55\" (UniqueName: \"kubernetes.io/projected/4cc6472e-53b0-4010-9c7e-5fae56b32e00-kube-api-access-d9b55\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-kubelet\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-etc-kubernetes\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7jm\" (UniqueName: \"kubernetes.io/projected/adb1352d-acea-4d2f-aff2-10575539bfae-kube-api-access-wd7jm\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e6c7798-3e64-45d4-88fa-1f044dd7030c-cni-binary-copy\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-etc-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqj2\" (UniqueName: \"kubernetes.io/projected/50de6307-e641-41fd-b41a-1f8634f5c208-kube-api-access-mgqj2\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28ab87e7-a174-44d1-a00c-16f49134a9b5-host\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2k8k\" (UniqueName: \"kubernetes.io/projected/28ab87e7-a174-44d1-a00c-16f49134a9b5-kube-api-access-t2k8k\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-log-socket\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-socket-dir-parent\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.957318 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-cni-bin\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod 
\"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-node-log\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-registration-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-netns\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adb1352d-acea-4d2f-aff2-10575539bfae-host-slash\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25ms\" (UniqueName: \"kubernetes.io/projected/dd5274ed-46c8-46c2-a74e-26859678b08d-kube-api-access-s25ms\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovn-node-metrics-cert\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d9kw\" (UniqueName: \"kubernetes.io/projected/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-kube-api-access-4d9kw\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/adb1352d-acea-4d2f-aff2-10575539bfae-iptables-alerter-script\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-device-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-system-cni-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.956982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-conf-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:04.958019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.957007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-os-release\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:04.958870 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.957047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-run-netns\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:04.987567 ip-10-0-130-130 kubenswrapper[2577]: I0416 
16:03:04.987536 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:58:03 +0000 UTC" deadline="2027-12-18 04:43:53.903542259 +0000 UTC" Apr 16 16:03:04.987567 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:04.987566 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14652h40m48.915979669s" Apr 16 16:03:05.044414 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.044341 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:03:05.057640 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/adb1352d-acea-4d2f-aff2-10575539bfae-iptables-alerter-script\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.057794 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-device-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.057794 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-system-cni-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.057794 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-conf-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.057794 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-system-cni-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.057794 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057769 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-os-release\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-run-netns\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057824 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-cni-bin\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-run\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-os-release\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-os-release\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-cni-bin\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-device-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-run-netns\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-cni-multus\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-systemd\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 
kubenswrapper[2577]: I0416 16:03:05.057952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-conf-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-cni-multus\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.057998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-systemd\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-systemd-units\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-os-release\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b105e59-460b-469d-be97-f4653f502e92-konnectivity-ca\") pod \"konnectivity-agent-mpcnj\" (UID: \"6b105e59-460b-469d-be97-f4653f502e92\") " pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-systemd-units\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-var-lib-kubelet\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-etc-selinux\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" 
Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-cni-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysctl-conf\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/adb1352d-acea-4d2f-aff2-10575539bfae-iptables-alerter-script\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058226 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-lib-modules\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-host\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-etc-selinux\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-cni-dir\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-socket-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-hostroot\") pod \"multus-vshnw\" (UID: 
\"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-socket-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-hostroot\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.058714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-cni-netd\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovnkube-script-lib\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xb2\" (UniqueName: \"kubernetes.io/projected/a4144120-204d-44b5-9a92-fb19a8e03118-kube-api-access-79xb2\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-k8s-cni-cncf-io\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgd2z\" (UniqueName: \"kubernetes.io/projected/5e6c7798-3e64-45d4-88fa-1f044dd7030c-kube-api-access-qgd2z\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-var-lib-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-cni-netd\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-cnibin\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-k8s-cni-cncf-io\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-daemon-config\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cnibin\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-sys\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-slash\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-var-lib-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-ovn\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-cnibin\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-run-ovn-kubernetes\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-slash\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.059562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovnkube-config\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-env-overrides\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b105e59-460b-469d-be97-f4653f502e92-agent-certs\") pod \"konnectivity-agent-mpcnj\" (UID: \"6b105e59-460b-469d-be97-f4653f502e92\") " pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-modprobe-d\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 
16:03:05.058830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng42l\" (UniqueName: \"kubernetes.io/projected/14943d57-3e54-4ff5-8849-7eccefbe2aa1-kube-api-access-ng42l\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cnibin\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-sys-fs\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-ovn\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-kubelet\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-multus-certs\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-sys-fs\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-system-cni-dir\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-run-ovn-kubernetes\") pod \"ovnkube-node-h6xk8\" (UID: 
\"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.058998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cni-binary-copy\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-multus-certs\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-kubelet\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.060208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovnkube-script-lib\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-tuned\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-system-cni-dir\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/28ab87e7-a174-44d1-a00c-16f49134a9b5-serviceca\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-env-overrides\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovnkube-config\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-daemon-config\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.059209 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9b55\" (UniqueName: \"kubernetes.io/projected/4cc6472e-53b0-4010-9c7e-5fae56b32e00-kube-api-access-d9b55\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-kubelet\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.059393 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:03:05.559342193 +0000 UTC m=+3.074380270 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-kubelet\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a4144120-204d-44b5-9a92-fb19a8e03118-hosts-file\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059525 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cni-binary-copy\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-etc-kubernetes\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7jm\" (UniqueName: \"kubernetes.io/projected/adb1352d-acea-4d2f-aff2-10575539bfae-kube-api-access-wd7jm\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.060825 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e6c7798-3e64-45d4-88fa-1f044dd7030c-cni-binary-copy\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059634 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/28ab87e7-a174-44d1-a00c-16f49134a9b5-serviceca\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-etc-kubernetes\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059660 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-etc-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-etc-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqj2\" (UniqueName: \"kubernetes.io/projected/50de6307-e641-41fd-b41a-1f8634f5c208-kube-api-access-mgqj2\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28ab87e7-a174-44d1-a00c-16f49134a9b5-host\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2k8k\" (UniqueName: \"kubernetes.io/projected/28ab87e7-a174-44d1-a00c-16f49134a9b5-kube-api-access-t2k8k\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28ab87e7-a174-44d1-a00c-16f49134a9b5-host\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-log-socket\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysconfig\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-log-socket\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059924 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-kubernetes\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-socket-dir-parent\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061398 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.059992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-cni-bin\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-multus-socket-dir-parent\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-node-log\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-var-lib-cni-bin\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e6c7798-3e64-45d4-88fa-1f044dd7030c-cni-binary-copy\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysctl-d\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-node-log\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14943d57-3e54-4ff5-8849-7eccefbe2aa1-tmp\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-registration-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-netns\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a4144120-204d-44b5-9a92-fb19a8e03118-tmp-dir\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/adb1352d-acea-4d2f-aff2-10575539bfae-host-slash\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s25ms\" (UniqueName: \"kubernetes.io/projected/dd5274ed-46c8-46c2-a74e-26859678b08d-kube-api-access-s25ms\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4cc6472e-53b0-4010-9c7e-5fae56b32e00-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e6c7798-3e64-45d4-88fa-1f044dd7030c-host-run-netns\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.061942 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-registration-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovn-node-metrics-cert\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50de6307-e641-41fd-b41a-1f8634f5c208-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adb1352d-acea-4d2f-aff2-10575539bfae-host-slash\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d9kw\" (UniqueName: \"kubernetes.io/projected/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-kube-api-access-4d9kw\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-systemd\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-run-openvswitch\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060697 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:03:05.062436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.060977 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4cc6472e-53b0-4010-9c7e-5fae56b32e00-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.063837 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.063819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-ovn-node-metrics-cert\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.070608 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.070286 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:05.070608 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.070306 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:05.070608 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.070317 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:05.070608 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.070372 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:03:05.570354304 +0000 UTC m=+3.085392378 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:05.072129 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.072105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqj2\" (UniqueName: \"kubernetes.io/projected/50de6307-e641-41fd-b41a-1f8634f5c208-kube-api-access-mgqj2\") pod \"aws-ebs-csi-driver-node-cghg5\" (UID: \"50de6307-e641-41fd-b41a-1f8634f5c208\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.072304 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.072101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7jm\" (UniqueName: \"kubernetes.io/projected/adb1352d-acea-4d2f-aff2-10575539bfae-kube-api-access-wd7jm\") pod \"iptables-alerter-jf4lx\" (UID: \"adb1352d-acea-4d2f-aff2-10575539bfae\") " pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.072717 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.072677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25ms\" (UniqueName: \"kubernetes.io/projected/dd5274ed-46c8-46c2-a74e-26859678b08d-kube-api-access-s25ms\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:05.073063 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.073027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgd2z\" (UniqueName: \"kubernetes.io/projected/5e6c7798-3e64-45d4-88fa-1f044dd7030c-kube-api-access-qgd2z\") pod \"multus-vshnw\" (UID: \"5e6c7798-3e64-45d4-88fa-1f044dd7030c\") " pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.073672 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.073629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2k8k\" (UniqueName: \"kubernetes.io/projected/28ab87e7-a174-44d1-a00c-16f49134a9b5-kube-api-access-t2k8k\") pod \"node-ca-7mgkb\" (UID: \"28ab87e7-a174-44d1-a00c-16f49134a9b5\") " pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.074619 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.074548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9b55\" (UniqueName: \"kubernetes.io/projected/4cc6472e-53b0-4010-9c7e-5fae56b32e00-kube-api-access-d9b55\") pod \"multus-additional-cni-plugins-blk6n\" (UID: \"4cc6472e-53b0-4010-9c7e-5fae56b32e00\") " pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.074619 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.074549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d9kw\" (UniqueName: \"kubernetes.io/projected/90e6fc2e-70a5-41d5-9a11-ce841bf5eabf-kube-api-access-4d9kw\") pod \"ovnkube-node-h6xk8\" (UID: \"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf\") " pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.095891 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.095848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" 
event={"ID":"2ac49a4c0565840b23e57dfca6108f86","Type":"ContainerStarted","Data":"e3ae66b3fb1589feb579453756c78306c11b4437474ea71535e2f777bc5fb170"} Apr 16 16:03:05.096958 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.096927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" event={"ID":"4a6fddb6306d2ef465913be401643923","Type":"ContainerStarted","Data":"89df03ac85a3aa774103320b25785e49df4a6fdbea3b3b0f85fab06b4baf432c"} Apr 16 16:03:05.158027 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.157988 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:05.160786 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysconfig\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160786 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-kubernetes\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysctl-d\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14943d57-3e54-4ff5-8849-7eccefbe2aa1-tmp\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a4144120-204d-44b5-9a92-fb19a8e03118-tmp-dir\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-kubernetes\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-systemd\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160875 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysconfig\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-run\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-systemd\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.160991 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.160987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b105e59-460b-469d-be97-f4653f502e92-konnectivity-ca\") pod \"konnectivity-agent-mpcnj\" (UID: \"6b105e59-460b-469d-be97-f4653f502e92\") " pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-var-lib-kubelet\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-run\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-var-lib-kubelet\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysctl-d\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysctl-conf\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161157 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-lib-modules\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-host\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79xb2\" (UniqueName: \"kubernetes.io/projected/a4144120-204d-44b5-9a92-fb19a8e03118-kube-api-access-79xb2\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-host\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a4144120-204d-44b5-9a92-fb19a8e03118-tmp-dir\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-sys\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b105e59-460b-469d-be97-f4653f502e92-agent-certs\") pod \"konnectivity-agent-mpcnj\" (UID: \"6b105e59-460b-469d-be97-f4653f502e92\") " pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-sysctl-conf\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-lib-modules\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.161400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-sys\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-modprobe-d\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng42l\" (UniqueName: \"kubernetes.io/projected/14943d57-3e54-4ff5-8849-7eccefbe2aa1-kube-api-access-ng42l\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-tuned\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a4144120-204d-44b5-9a92-fb19a8e03118-hosts-file\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b105e59-460b-469d-be97-f4653f502e92-konnectivity-ca\") pod \"konnectivity-agent-mpcnj\" (UID: \"6b105e59-460b-469d-be97-f4653f502e92\") " pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-modprobe-d\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.162325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.161633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a4144120-204d-44b5-9a92-fb19a8e03118-hosts-file\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.163992 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.163969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b105e59-460b-469d-be97-f4653f502e92-agent-certs\") pod \"konnectivity-agent-mpcnj\" (UID: \"6b105e59-460b-469d-be97-f4653f502e92\") " pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.167605 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.167577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/14943d57-3e54-4ff5-8849-7eccefbe2aa1-tmp\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.167710 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.167614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14943d57-3e54-4ff5-8849-7eccefbe2aa1-etc-tuned\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.178885 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.178833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng42l\" (UniqueName: \"kubernetes.io/projected/14943d57-3e54-4ff5-8849-7eccefbe2aa1-kube-api-access-ng42l\") pod \"tuned-6q4mf\" (UID: \"14943d57-3e54-4ff5-8849-7eccefbe2aa1\") " pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.179179 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.179156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xb2\" (UniqueName: \"kubernetes.io/projected/a4144120-204d-44b5-9a92-fb19a8e03118-kube-api-access-79xb2\") pod \"node-resolver-chvcc\" (UID: \"a4144120-204d-44b5-9a92-fb19a8e03118\") " pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.253685 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.253654 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jf4lx" Apr 16 16:03:05.261408 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.261384 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" Apr 16 16:03:05.271023 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.271003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vshnw" Apr 16 16:03:05.286668 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.286647 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7mgkb" Apr 16 16:03:05.294256 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.294237 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-blk6n" Apr 16 16:03:05.300897 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.300878 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:05.307481 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.307458 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:05.314061 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.314041 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" Apr 16 16:03:05.319606 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.319587 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-chvcc" Apr 16 16:03:05.563678 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.563610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:05.563857 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.563726 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:05.563857 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.563793 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:03:06.563773303 +0000 UTC m=+4.078811400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:05.664102 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.664069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:05.664280 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.664203 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:05.664280 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.664221 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:05.664280 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.664233 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:05.664452 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:05.664291 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:03:06.664273087 +0000 UTC m=+4.179311180 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:05.856455 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.856426 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc6472e_53b0_4010_9c7e_5fae56b32e00.slice/crio-f7e58ebf08fa6d0ef8ff2b27b37936a520b46d19419c3987ad5fb883c49906c6 WatchSource:0}: Error finding container f7e58ebf08fa6d0ef8ff2b27b37936a520b46d19419c3987ad5fb883c49906c6: Status 404 returned error can't find the container with id f7e58ebf08fa6d0ef8ff2b27b37936a520b46d19419c3987ad5fb883c49906c6 Apr 16 16:03:05.858013 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.857950 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14943d57_3e54_4ff5_8849_7eccefbe2aa1.slice/crio-6a93a418a94556392d03dea4b70f880aa319e677c8b772cf84b8a02e26751eaa WatchSource:0}: Error finding container 6a93a418a94556392d03dea4b70f880aa319e677c8b772cf84b8a02e26751eaa: Status 404 returned error can't find the container with id 6a93a418a94556392d03dea4b70f880aa319e677c8b772cf84b8a02e26751eaa Apr 16 16:03:05.860678 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.860650 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4144120_204d_44b5_9a92_fb19a8e03118.slice/crio-8f0d078396b1d58fcb24b51fdca576f752b0809bfebd74e594395f3a902a9640 WatchSource:0}: Error finding container 8f0d078396b1d58fcb24b51fdca576f752b0809bfebd74e594395f3a902a9640: Status 404 returned error can't find the container with id 8f0d078396b1d58fcb24b51fdca576f752b0809bfebd74e594395f3a902a9640 Apr 16 16:03:05.861336 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.861300 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50de6307_e641_41fd_b41a_1f8634f5c208.slice/crio-16f370287dfd1e4e48cc9c0033eaf286d55fb9e3ed97335e17a79930318170bb WatchSource:0}: Error finding container 16f370287dfd1e4e48cc9c0033eaf286d55fb9e3ed97335e17a79930318170bb: Status 404 returned error can't find the container with id 16f370287dfd1e4e48cc9c0033eaf286d55fb9e3ed97335e17a79930318170bb Apr 16 16:03:05.865417 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.865324 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e6fc2e_70a5_41d5_9a11_ce841bf5eabf.slice/crio-d8128afc07f373e8d834c5b1ad235d9ac59d10fce6f197c0c4f69a7f8411d05a WatchSource:0}: Error finding container d8128afc07f373e8d834c5b1ad235d9ac59d10fce6f197c0c4f69a7f8411d05a: Status 404 returned error can't find the container with id d8128afc07f373e8d834c5b1ad235d9ac59d10fce6f197c0c4f69a7f8411d05a Apr 16 16:03:05.866730 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.866707 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadb1352d_acea_4d2f_aff2_10575539bfae.slice/crio-486d652ebf4e1a18f9ff7adb9c18d3d836223ede49f5ee86c4d329886806e235 WatchSource:0}: Error finding 
container 486d652ebf4e1a18f9ff7adb9c18d3d836223ede49f5ee86c4d329886806e235: Status 404 returned error can't find the container with id 486d652ebf4e1a18f9ff7adb9c18d3d836223ede49f5ee86c4d329886806e235 Apr 16 16:03:05.867553 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.867534 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b105e59_460b_469d_be97_f4653f502e92.slice/crio-381eed767fd33756d48786012af6e79191e0063984e09dd79aeefcc3f75c156f WatchSource:0}: Error finding container 381eed767fd33756d48786012af6e79191e0063984e09dd79aeefcc3f75c156f: Status 404 returned error can't find the container with id 381eed767fd33756d48786012af6e79191e0063984e09dd79aeefcc3f75c156f Apr 16 16:03:05.868591 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.868567 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ab87e7_a174_44d1_a00c_16f49134a9b5.slice/crio-d194e653bcb9721a8c323ef46264d4c554bcf217a6c848c554457542a0be5f32 WatchSource:0}: Error finding container d194e653bcb9721a8c323ef46264d4c554bcf217a6c848c554457542a0be5f32: Status 404 returned error can't find the container with id d194e653bcb9721a8c323ef46264d4c554bcf217a6c848c554457542a0be5f32 Apr 16 16:03:05.869801 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:05.869778 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6c7798_3e64_45d4_88fa_1f044dd7030c.slice/crio-3bec6259134cbd03c109fa703d423c73f2c97fada900842345af4aa8baccb3f7 WatchSource:0}: Error finding container 3bec6259134cbd03c109fa703d423c73f2c97fada900842345af4aa8baccb3f7: Status 404 returned error can't find the container with id 3bec6259134cbd03c109fa703d423c73f2c97fada900842345af4aa8baccb3f7 Apr 16 16:03:05.988050 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.987878 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:58:03 +0000 UTC" deadline="2027-09-21 18:14:37.475851456 +0000 UTC" Apr 16 16:03:05.988050 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:05.988048 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12554h11m31.487807618s" Apr 16 16:03:06.099766 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.099659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" event={"ID":"50de6307-e641-41fd-b41a-1f8634f5c208","Type":"ContainerStarted","Data":"16f370287dfd1e4e48cc9c0033eaf286d55fb9e3ed97335e17a79930318170bb"} Apr 16 16:03:06.100639 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.100601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerStarted","Data":"f7e58ebf08fa6d0ef8ff2b27b37936a520b46d19419c3987ad5fb883c49906c6"} Apr 16 16:03:06.101607 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.101571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vshnw" event={"ID":"5e6c7798-3e64-45d4-88fa-1f044dd7030c","Type":"ContainerStarted","Data":"3bec6259134cbd03c109fa703d423c73f2c97fada900842345af4aa8baccb3f7"} Apr 16 16:03:06.102581 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.102556 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-7mgkb" event={"ID":"28ab87e7-a174-44d1-a00c-16f49134a9b5","Type":"ContainerStarted","Data":"d194e653bcb9721a8c323ef46264d4c554bcf217a6c848c554457542a0be5f32"} Apr 16 16:03:06.103626 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.103598 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jf4lx" event={"ID":"adb1352d-acea-4d2f-aff2-10575539bfae","Type":"ContainerStarted","Data":"486d652ebf4e1a18f9ff7adb9c18d3d836223ede49f5ee86c4d329886806e235"} Apr 16 16:03:06.104592 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.104570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"d8128afc07f373e8d834c5b1ad235d9ac59d10fce6f197c0c4f69a7f8411d05a"} Apr 16 16:03:06.105618 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.105597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-chvcc" event={"ID":"a4144120-204d-44b5-9a92-fb19a8e03118","Type":"ContainerStarted","Data":"8f0d078396b1d58fcb24b51fdca576f752b0809bfebd74e594395f3a902a9640"} Apr 16 16:03:06.106638 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.106617 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" event={"ID":"14943d57-3e54-4ff5-8849-7eccefbe2aa1","Type":"ContainerStarted","Data":"6a93a418a94556392d03dea4b70f880aa319e677c8b772cf84b8a02e26751eaa"} Apr 16 16:03:06.108125 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.108106 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" event={"ID":"4a6fddb6306d2ef465913be401643923","Type":"ContainerStarted","Data":"92b9559e568c23514a2927fb0363365de4c6ef803c40732f8c2d4fb46a110589"} Apr 16 16:03:06.109172 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.109156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mpcnj" event={"ID":"6b105e59-460b-469d-be97-f4653f502e92","Type":"ContainerStarted","Data":"381eed767fd33756d48786012af6e79191e0063984e09dd79aeefcc3f75c156f"} Apr 16 16:03:06.128394 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.128350 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-130.ec2.internal" podStartSLOduration=2.1283356700000002 podStartE2EDuration="2.12833567s" podCreationTimestamp="2026-04-16 16:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:06.124539667 +0000 UTC m=+3.639577763" watchObservedRunningTime="2026-04-16 16:03:06.12833567 +0000 UTC m=+3.643373766" Apr 16 16:03:06.573412 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.572843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:06.573412 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:06.572984 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:06.573412 ip-10-0-130-130 kubenswrapper[2577]: E0416 
16:03:06.573047 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:03:08.573027734 +0000 UTC m=+6.088065810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:06.674126 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:06.673391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:06.674126 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:06.673545 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:06.674126 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:06.673560 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:06.674126 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:06.673570 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:06.674126 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:06.673672 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:03:08.673611408 +0000 UTC m=+6.188649502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:07.104064 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:07.103988 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:07.104485 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:07.104125 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:07.104485 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:07.104235 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:07.104485 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:07.104311 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:07.133940 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:07.133899 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ac49a4c0565840b23e57dfca6108f86" containerID="5b122df9c80816679df39d2893aa6039261fd5d9b2275066bcbe91337a7437c1" exitCode=0 Apr 16 16:03:07.134142 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:07.134113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" event={"ID":"2ac49a4c0565840b23e57dfca6108f86","Type":"ContainerDied","Data":"5b122df9c80816679df39d2893aa6039261fd5d9b2275066bcbe91337a7437c1"} Apr 16 16:03:08.143039 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:08.143006 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" event={"ID":"2ac49a4c0565840b23e57dfca6108f86","Type":"ContainerStarted","Data":"2208491131a274f1d9c9a26e79492543f0bbd9de7008c98393182834e8c2a0e2"} Apr 16 16:03:08.590124 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:08.590030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:08.590283 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:08.590245 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:08.590344 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:08.590305 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:03:12.590286283 +0000 UTC m=+10.105324371 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:08.690439 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:08.690406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:08.690644 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:08.690560 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:08.690644 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:08.690581 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:08.690644 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:08.690592 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:08.690644 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:08.690644 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:03:12.690626929 +0000 UTC m=+10.205665009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:09.091447 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:09.091410 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:09.091627 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:09.091598 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:09.092155 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:09.092132 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:09.092278 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:09.092247 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:11.092149 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:11.091648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:11.092149 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:11.091648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:11.092149 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:11.091780 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:11.092149 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:11.091882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:12.623882 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:12.623825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:12.624338 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:12.623953 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:12.624338 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:12.624003 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:03:20.623990739 +0000 UTC m=+18.139028813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:12.725162 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:12.725123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:12.725349 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:12.725302 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:12.725349 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:12.725321 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:12.725349 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:12.725333 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:12.725501 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:12.725392 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:03:20.72537299 +0000 UTC m=+18.240411069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:13.095559 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:13.093841 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:13.095559 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:13.093977 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:13.095559 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:13.094346 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:13.095559 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:13.094427 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:15.091695 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:15.091660 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:15.092117 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:15.091779 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:15.092117 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:15.091836 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:15.092117 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:15.091940 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:17.091582 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:17.090868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:17.091582 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:17.090996 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:17.091582 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:17.091412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:17.091582 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:17.091537 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:19.091356 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:19.091315 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:19.091808 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:19.091332 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:19.091959 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:19.091928 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:19.092582 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:19.092552 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:20.682414 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:20.682359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:20.682955 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:20.682545 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:20.682955 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:20.682625 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:03:36.682604147 +0000 UTC m=+34.197642221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:20.783754 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:20.783718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:20.783935 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:20.783902 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:20.783935 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:20.783924 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:20.783935 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:20.783935 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:20.784072 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:20.783996 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:03:36.783977301 +0000 UTC m=+34.299015376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:21.091130 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:21.091038 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:21.091295 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:21.091059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:21.091295 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:21.091193 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:21.091295 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:21.091214 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:23.093467 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:23.093434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:23.093887 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:23.093674 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:23.094196 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:23.094174 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:23.094330 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:23.094291 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:24.179980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:24.179298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vshnw" event={"ID":"5e6c7798-3e64-45d4-88fa-1f044dd7030c","Type":"ContainerStarted","Data":"4c7b94fd28d028b5d5fff0606793683684b185805c98c60b2faab5c05a1ffb22"} Apr 16 16:03:24.191983 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:24.191949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" event={"ID":"14943d57-3e54-4ff5-8849-7eccefbe2aa1","Type":"ContainerStarted","Data":"ba59cc94d760ebd48c06f729674fe43b012ad5cb90b590fa4ca262d32f98c8d5"} Apr 16 16:03:24.194598 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:24.194558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" event={"ID":"50de6307-e641-41fd-b41a-1f8634f5c208","Type":"ContainerStarted","Data":"eedca9662b3f8458d39622a20ab21019251695ac0ee2d1f247de9404c6f149fb"} Apr 16 16:03:24.203907 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:24.203862 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-130.ec2.internal" podStartSLOduration=20.203847319 podStartE2EDuration="20.203847319s" podCreationTimestamp="2026-04-16 16:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:08.169038883 +0000 UTC m=+5.684076981" watchObservedRunningTime="2026-04-16 16:03:24.203847319 +0000 UTC m=+21.718885416" Apr 16 16:03:24.204267 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:24.204228 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vshnw" podStartSLOduration=3.174169136 podStartE2EDuration="21.204219121s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.872961466 +0000 UTC m=+3.387999542" lastFinishedPulling="2026-04-16 16:03:23.903011447 +0000 UTC m=+21.418049527" observedRunningTime="2026-04-16 16:03:24.203083054 +0000 UTC m=+21.718121174" watchObservedRunningTime="2026-04-16 16:03:24.204219121 +0000 UTC m=+21.719257218" Apr 16 16:03:24.251583 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:24.251531 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6q4mf" podStartSLOduration=3.209364898 podStartE2EDuration="21.251492752s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.860896865 +0000 UTC m=+3.375934951" lastFinishedPulling="2026-04-16 16:03:23.903024731 +0000 UTC m=+21.418062805" observedRunningTime="2026-04-16 16:03:24.249563221 +0000 UTC m=+21.764601317" watchObservedRunningTime="2026-04-16 16:03:24.251492752 +0000 UTC m=+21.766530851" Apr 16 16:03:25.091312 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.091135 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:25.091431 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.091141 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:25.091431 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:25.091384 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:25.091561 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:25.091497 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:25.121348 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.121322 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:03:25.199576 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.199541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-chvcc" event={"ID":"a4144120-204d-44b5-9a92-fb19a8e03118","Type":"ContainerStarted","Data":"a94c333602129297115d0285fb9d0e15a3cffc87063c9c09e90f58c7a80fc770"} Apr 16 16:03:25.200892 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.200869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mpcnj" event={"ID":"6b105e59-460b-469d-be97-f4653f502e92","Type":"ContainerStarted","Data":"dc6ee4ac10263b302bb93aaad3aaba0f5718d08f5b5ba84f688dfabe91e11f4a"} Apr 16 16:03:25.202433 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.202411 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" event={"ID":"50de6307-e641-41fd-b41a-1f8634f5c208","Type":"ContainerStarted","Data":"335ae9fb69a1b5161276541413da7b98e0f7c27666a3ea8637c401ae59676b53"} Apr 16 16:03:25.203776 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.203749 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cc6472e-53b0-4010-9c7e-5fae56b32e00" containerID="2db6e75f1133deea14826b8e1c78695da631234430df96451477ed683afe8bea" exitCode=0 Apr 16 16:03:25.203855 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.203820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerDied","Data":"2db6e75f1133deea14826b8e1c78695da631234430df96451477ed683afe8bea"} Apr 16 16:03:25.205191 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.205163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7mgkb" event={"ID":"28ab87e7-a174-44d1-a00c-16f49134a9b5","Type":"ContainerStarted","Data":"428efbd818b93c691bdff6fd7b4c3781def32f9683f273bdcfda7c786a0ce1bc"} Apr 16 16:03:25.207668 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.207654 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:03:25.207967 ip-10-0-130-130 
kubenswrapper[2577]: I0416 16:03:25.207950 2577 generic.go:358] "Generic (PLEG): container finished" podID="90e6fc2e-70a5-41d5-9a11-ce841bf5eabf" containerID="13e433d07e08479add77c864a9fe55c12d366a4f7e94fe42d4fc1e1af22dfc57" exitCode=1 Apr 16 16:03:25.208041 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.208024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"0f015910110d91b19a27d693a5ba4571609707de86503e1ddabce4e150b68257"} Apr 16 16:03:25.208082 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.208053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"2c3e4f2ad2da5da3d8a80770e22f67761974bb4205fea2d69e507966efadd5d4"} Apr 16 16:03:25.208082 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.208067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"1b503d460b7dee9dae3b6ea6478c0f7f5da06a2e7fad97d07b618f207ddd0b83"} Apr 16 16:03:25.208082 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.208078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"90622abaa8095dcac7d51b02ba75b3d06ff0967bb77846d8f19de399022da972"} Apr 16 16:03:25.208167 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.208090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"401c6c2ee6034e603d00744b69ae493bec191ea7e5e65cf19803aac2fbdf055e"} Apr 16 16:03:25.208167 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.208098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerDied","Data":"13e433d07e08479add77c864a9fe55c12d366a4f7e94fe42d4fc1e1af22dfc57"} Apr 16 16:03:25.214950 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.214916 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-chvcc" podStartSLOduration=4.176558303 podStartE2EDuration="22.21490472s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.86384792 +0000 UTC m=+3.378886001" lastFinishedPulling="2026-04-16 16:03:23.90219433 +0000 UTC m=+21.417232418" observedRunningTime="2026-04-16 16:03:25.214490473 +0000 UTC m=+22.729528568" watchObservedRunningTime="2026-04-16 16:03:25.21490472 +0000 UTC m=+22.729942815" Apr 16 16:03:25.229867 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.229822 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7mgkb" podStartSLOduration=4.224727056 podStartE2EDuration="22.229807508s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.871989488 +0000 UTC m=+3.387027577" lastFinishedPulling="2026-04-16 16:03:23.877069949 +0000 UTC m=+21.392108029" observedRunningTime="2026-04-16 16:03:25.229607274 +0000 UTC m=+22.744645372" watchObservedRunningTime="2026-04-16 16:03:25.229807508 +0000 UTC m=+22.744845603" Apr 16 16:03:25.270209 
ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:25.268738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mpcnj" podStartSLOduration=4.388604641 podStartE2EDuration="22.268720513s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.871866289 +0000 UTC m=+3.386904366" lastFinishedPulling="2026-04-16 16:03:23.751982162 +0000 UTC m=+21.267020238" observedRunningTime="2026-04-16 16:03:25.268011458 +0000 UTC m=+22.783049555" watchObservedRunningTime="2026-04-16 16:03:25.268720513 +0000 UTC m=+22.783758610" Apr 16 16:03:26.013829 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.013725 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:03:25.12134368Z","UUID":"3f9f0b89-aeb9-4a20-b36d-93342484501f","Handler":null,"Name":"","Endpoint":""} Apr 16 16:03:26.015878 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.015852 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:03:26.016012 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.015886 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:03:26.212423 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.212380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" event={"ID":"50de6307-e641-41fd-b41a-1f8634f5c208","Type":"ContainerStarted","Data":"427f879c9f0512aeb89ef464039aa95f18fe3035b094bbe2d50bcbd08ba36120"} Apr 16 16:03:26.213848 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.213819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jf4lx" event={"ID":"adb1352d-acea-4d2f-aff2-10575539bfae","Type":"ContainerStarted","Data":"9b950c34ad0dec9d75b50bb5bb322e0c4b85226970cba270cac1f11dee3c1118"} Apr 16 16:03:26.231263 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.231217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cghg5" podStartSLOduration=3.244711031 podStartE2EDuration="23.231203033s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.863158244 +0000 UTC m=+3.378196320" lastFinishedPulling="2026-04-16 16:03:25.849650239 +0000 UTC m=+23.364688322" observedRunningTime="2026-04-16 16:03:26.231077706 +0000 UTC m=+23.746115802" watchObservedRunningTime="2026-04-16 16:03:26.231203033 +0000 UTC m=+23.746241131" Apr 16 16:03:26.247618 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:26.247576 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jf4lx" podStartSLOduration=5.211395908 podStartE2EDuration="23.247562146s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.870882862 +0000 UTC m=+3.385920943" lastFinishedPulling="2026-04-16 16:03:23.907049106 +0000 UTC m=+21.422087181" observedRunningTime="2026-04-16 16:03:26.246994991 +0000 UTC m=+23.762033088" watchObservedRunningTime="2026-04-16 16:03:26.247562146 +0000 UTC m=+23.762600242" Apr 16 16:03:27.091933 ip-10-0-130-130 kubenswrapper[2577]: I0416 
16:03:27.091683 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:27.092123 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:27.091716 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:27.092123 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:27.092057 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:27.092123 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:27.092102 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:27.219582 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:27.219554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:03:27.220046 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:27.219883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"b5990fb32557ff2f95b1b7a281244f2d69badd2ee67db5292357f1c993343bd7"} Apr 16 16:03:27.224519 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:27.224489 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:27.225130 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:27.225086 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:28.221331 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:28.221303 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:28.221927 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:28.221901 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mpcnj" Apr 16 16:03:29.091229 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:29.091153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:29.091410 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:29.091158 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:29.091410 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:29.091300 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:29.091410 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:29.091383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:30.227920 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.227752 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:03:30.228672 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.228247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"dcb206fa4eac3badf2498c8032af35c13754846bef329cd2c9ebc6b86ff00bac"} Apr 16 16:03:30.228672 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.228541 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:30.228783 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.228727 2577 scope.go:117] "RemoveContainer" containerID="13e433d07e08479add77c864a9fe55c12d366a4f7e94fe42d4fc1e1af22dfc57" Apr 16 16:03:30.230129 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.230105 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cc6472e-53b0-4010-9c7e-5fae56b32e00" containerID="6b1263905b2633525a2c86b5d60396fb7a5a8acb0a92e8656865677f57ae6f6f" exitCode=0 Apr 16 16:03:30.230229 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.230167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerDied","Data":"6b1263905b2633525a2c86b5d60396fb7a5a8acb0a92e8656865677f57ae6f6f"} Apr 16 16:03:30.245120 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:30.245101 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:31.090941 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.090798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:31.090941 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.090848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:31.091139 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:31.090942 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:31.091139 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:31.091035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:31.234746 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.234675 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:03:31.235149 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.235030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" event={"ID":"90e6fc2e-70a5-41d5-9a11-ce841bf5eabf","Type":"ContainerStarted","Data":"b6d0c9a13f8571ea11840772a8bbb516cde45e0a5ac3c6e9e9a6dc5b3f5cdca6"} Apr 16 16:03:31.235324 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.235302 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:31.235396 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.235334 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:31.237236 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.237215 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cc6472e-53b0-4010-9c7e-5fae56b32e00" containerID="876fc48be6c34e8a35d987a76ae893649ade65bae5cba9fc138b2bef97a154d1" exitCode=0 Apr 16 16:03:31.237343 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.237244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerDied","Data":"876fc48be6c34e8a35d987a76ae893649ade65bae5cba9fc138b2bef97a154d1"} Apr 16 16:03:31.251358 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.251175 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:03:31.268695 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.268659 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" podStartSLOduration=10.061364738 podStartE2EDuration="28.268646407s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.867491156 +0000 UTC m=+3.382529231" lastFinishedPulling="2026-04-16 16:03:24.074772821 +0000 UTC m=+21.589810900" observedRunningTime="2026-04-16 16:03:31.267285944 +0000 UTC m=+28.782324019" watchObservedRunningTime="2026-04-16 16:03:31.268646407 +0000 UTC m=+28.783684503" Apr 16 16:03:31.329247 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.329217 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2mqsw"] Apr 16 16:03:31.329396 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.329343 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:31.329453 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:31.329432 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:31.341869 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.341834 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5n846"] Apr 16 16:03:31.341995 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:31.341967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:31.342092 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:31.342069 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:32.241008 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:32.240974 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cc6472e-53b0-4010-9c7e-5fae56b32e00" containerID="381d17719d81929059b22d5b2a8e2a6fd33c840b04311362dd8af89e9ea0451e" exitCode=0 Apr 16 16:03:32.241384 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:32.241042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerDied","Data":"381d17719d81929059b22d5b2a8e2a6fd33c840b04311362dd8af89e9ea0451e"} Apr 16 16:03:33.092486 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:33.092409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:33.092486 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:33.092458 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:33.092757 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:33.092547 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:33.092757 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:33.092603 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:35.091189 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:35.091145 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:35.092055 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:35.091276 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5n846" podUID="52215440-c220-46dc-927b-72ff3dad940a" Apr 16 16:03:35.092055 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:35.091332 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:35.092055 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:35.091457 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:03:36.697203 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.697169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:36.697767 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:36.697306 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:36.697767 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:36.697366 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:04:08.697351198 +0000 UTC m=+66.212389274 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:36.797646 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.797603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:36.797841 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:36.797766 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:36.797841 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:36.797785 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:36.797841 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:36.797798 2577 projected.go:194] Error preparing data for projected volume kube-api-access-w6cpp for pod openshift-network-diagnostics/network-check-target-5n846: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:36.798004 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:36.797869 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp podName:52215440-c220-46dc-927b-72ff3dad940a nodeName:}" failed. No retries permitted until 2026-04-16 16:04:08.797849654 +0000 UTC m=+66.312887745 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-w6cpp" (UniqueName: "kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp") pod "network-check-target-5n846" (UID: "52215440-c220-46dc-927b-72ff3dad940a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:36.854584 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.854556 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-130.ec2.internal" event="NodeReady" Apr 16 16:03:36.854757 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.854685 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:03:36.933152 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.933074 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hccnt"] Apr 16 16:03:36.963535 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.963474 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b8qrw"] Apr 16 16:03:36.963677 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.963641 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:36.968271 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.968247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:03:36.968271 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.968266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jhn24\"" Apr 16 16:03:36.968465 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.968269 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:03:36.982389 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.982362 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hccnt"] Apr 16 16:03:36.982389 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.982385 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8qrw"] Apr 16 16:03:36.982606 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.982473 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:36.985208 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.985187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:03:36.985748 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.985569 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:03:36.985748 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.985635 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l2qxk\"" Apr 16 16:03:36.986113 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:36.986084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:03:37.091236 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.091203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:03:37.091427 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.091203 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:03:37.093833 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.093809 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:03:37.093956 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.093841 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hwq6n\"" Apr 16 16:03:37.093956 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.093887 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:03:37.094128 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.094110 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:03:37.094197 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.094160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qf27n\"" Apr 16 16:03:37.099851 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.099827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:37.099975 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.099918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.099975 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.099960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5434c75-d025-4504-969f-7f577a43a937-tmp-dir\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.100092 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.099992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5434c75-d025-4504-969f-7f577a43a937-config-volume\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.100092 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.100036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmndt\" (UniqueName: \"kubernetes.io/projected/b5434c75-d025-4504-969f-7f577a43a937-kube-api-access-mmndt\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.100194 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.100097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c48q\" (UniqueName: \"kubernetes.io/projected/efb959fd-8419-441a-a0a0-4d727132d9de-kube-api-access-7c48q\") pod \"ingress-canary-b8qrw\" (UID: 
\"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:37.200549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.200444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c48q\" (UniqueName: \"kubernetes.io/projected/efb959fd-8419-441a-a0a0-4d727132d9de-kube-api-access-7c48q\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:37.200549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.200496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:37.200773 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.200612 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:37.200773 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.200684 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:03:37.70066665 +0000 UTC m=+35.215704733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:03:37.200773 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.200704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.200773 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.200731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5434c75-d025-4504-969f-7f577a43a937-tmp-dir\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.200954 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.200784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5434c75-d025-4504-969f-7f577a43a937-config-volume\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.200954 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.200814 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:37.200954 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.200900 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:37.700879169 +0000 UTC m=+35.215917257 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:03:37.200954 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.200814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmndt\" (UniqueName: \"kubernetes.io/projected/b5434c75-d025-4504-969f-7f577a43a937-kube-api-access-mmndt\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.201142 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.201093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5434c75-d025-4504-969f-7f577a43a937-tmp-dir\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.201381 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.201363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5434c75-d025-4504-969f-7f577a43a937-config-volume\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.213243 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.213216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmndt\" (UniqueName: \"kubernetes.io/projected/b5434c75-d025-4504-969f-7f577a43a937-kube-api-access-mmndt\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.213399 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.213381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c48q\" (UniqueName: \"kubernetes.io/projected/efb959fd-8419-441a-a0a0-4d727132d9de-kube-api-access-7c48q\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:37.704403 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.704165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:37.704827 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:37.704450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:37.704827 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.704334 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:37.704827 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.704587 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:38.704565129 +0000 UTC m=+36.219603226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:03:37.704827 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.704602 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:37.704827 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:37.704669 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:38.704651594 +0000 UTC m=+36.219689672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:03:38.711302 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:38.711250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:38.711750 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:38.711347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:38.711750 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:38.711401 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:38.711750 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:38.711442 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:38.711750 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:38.711471 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:40.711454937 +0000 UTC m=+38.226493011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:03:38.711750 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:38.711494 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:03:40.711478624 +0000 UTC m=+38.226516712 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:03:39.257122 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:39.257092 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cc6472e-53b0-4010-9c7e-5fae56b32e00" containerID="01f09dc9a5149cc18cf063eba6cb8f7d11d12761a69893c20507c0a99465f031" exitCode=0 Apr 16 16:03:39.257285 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:39.257136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerDied","Data":"01f09dc9a5149cc18cf063eba6cb8f7d11d12761a69893c20507c0a99465f031"} Apr 16 16:03:40.261322 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:40.261285 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cc6472e-53b0-4010-9c7e-5fae56b32e00" containerID="e63498dee51c49e2ec339084764be2e3124859d0821b9fe21a8e80430687db6e" exitCode=0 Apr 16 16:03:40.261789 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:40.261352 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerDied","Data":"e63498dee51c49e2ec339084764be2e3124859d0821b9fe21a8e80430687db6e"} Apr 16 16:03:40.727123 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:40.727085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:40.727301 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:40.727135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:40.727301 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:40.727241 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:40.727301 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:40.727244 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:40.727301 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:40.727300 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:44.72728689 +0000 UTC m=+42.242324965 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:03:40.727440 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:40.727313 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:03:44.727308052 +0000 UTC m=+42.242346125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:03:41.265908 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:41.265882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blk6n" event={"ID":"4cc6472e-53b0-4010-9c7e-5fae56b32e00","Type":"ContainerStarted","Data":"94471ab0c4167cfbb72cf276dd11e6e1573be23476ad473a2c95a1c8fdda62ec"} Apr 16 16:03:41.291062 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:41.291010 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-blk6n" podStartSLOduration=5.943282777 podStartE2EDuration="38.290996399s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:05.858847385 +0000 UTC m=+3.373885465" lastFinishedPulling="2026-04-16 16:03:38.206560999 +0000 UTC m=+35.721599087" observedRunningTime="2026-04-16 16:03:41.289910208 +0000 UTC m=+38.804948304" watchObservedRunningTime="2026-04-16 16:03:41.290996399 +0000 UTC m=+38.806034495" Apr 16 16:03:44.753184 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:44.753147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:44.753628 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:44.753212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:44.753628 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:44.753299 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:44.753628 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:44.753303 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:44.753628 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:44.753356 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:03:52.753340425 +0000 UTC m=+50.268378499 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:03:44.753628 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:44.753368 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:52.753362696 +0000 UTC m=+50.268400770 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:03:52.542951 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.542915 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx"] Apr 16 16:03:52.577012 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.576972 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx"] Apr 16 16:03:52.577012 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.577007 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4"] Apr 16 16:03:52.577215 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.577104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.580416 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.580392 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vscwm\"" Apr 16 16:03:52.580416 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.580393 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 16:03:52.581906 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.581891 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:03:52.582025 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.581930 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:03:52.582354 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.582309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:03:52.594876 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.594810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4"] Apr 16 16:03:52.594975 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.594927 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.597465 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.597445 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 16:03:52.597920 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.597906 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 16:03:52.597974 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.597937 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 16:03:52.598481 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.598462 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 16:03:52.708681 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.708843 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d759bb16-79f7-47eb-93fd-14305f8a850f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.708843 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9x5\" (UniqueName: \"kubernetes.io/projected/d759bb16-79f7-47eb-93fd-14305f8a850f-kube-api-access-xm9x5\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.708843 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.708843 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-ca\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.708976 ip-10-0-130-130 
kubenswrapper[2577]: I0416 16:03:52.708852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptg2m\" (UniqueName: \"kubernetes.io/projected/8512020c-1ec9-4924-b45c-1368fce88a49-kube-api-access-ptg2m\") pod \"managed-serviceaccount-addon-agent-6bb999cf8c-zscrx\" (UID: \"8512020c-1ec9-4924-b45c-1368fce88a49\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.708976 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8512020c-1ec9-4924-b45c-1368fce88a49-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bb999cf8c-zscrx\" (UID: \"8512020c-1ec9-4924-b45c-1368fce88a49\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.708976 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.708967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-hub\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.809998 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.809927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.810120 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-ca\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.810120 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptg2m\" (UniqueName: \"kubernetes.io/projected/8512020c-1ec9-4924-b45c-1368fce88a49-kube-api-access-ptg2m\") pod \"managed-serviceaccount-addon-agent-6bb999cf8c-zscrx\" (UID: \"8512020c-1ec9-4924-b45c-1368fce88a49\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.810186 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8512020c-1ec9-4924-b45c-1368fce88a49-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bb999cf8c-zscrx\" (UID: \"8512020c-1ec9-4924-b45c-1368fce88a49\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.810223 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod 
\"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:03:52.810256 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-hub\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.810256 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.810325 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:03:52.810380 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:52.810342 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:52.810380 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:52.810365 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:52.810468 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:52.810419 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:04:08.810398628 +0000 UTC m=+66.325436714 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:03:52.810468 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:03:52.810438 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:08.810428056 +0000 UTC m=+66.325466145 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:03:52.810813 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d759bb16-79f7-47eb-93fd-14305f8a850f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.810950 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.810829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9x5\" (UniqueName: \"kubernetes.io/projected/d759bb16-79f7-47eb-93fd-14305f8a850f-kube-api-access-xm9x5\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.811983 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.811951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d759bb16-79f7-47eb-93fd-14305f8a850f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.813965 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.813932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.814082 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.814047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.814169 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.814150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-ca\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.814246 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.814230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8512020c-1ec9-4924-b45c-1368fce88a49-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6bb999cf8c-zscrx\" (UID: \"8512020c-1ec9-4924-b45c-1368fce88a49\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.814311 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.814293 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d759bb16-79f7-47eb-93fd-14305f8a850f-hub\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.822734 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.822713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9x5\" (UniqueName: \"kubernetes.io/projected/d759bb16-79f7-47eb-93fd-14305f8a850f-kube-api-access-xm9x5\") pod \"cluster-proxy-proxy-agent-68f544d458-mn4b4\" (UID: \"d759bb16-79f7-47eb-93fd-14305f8a850f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:52.824254 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.824232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptg2m\" (UniqueName: \"kubernetes.io/projected/8512020c-1ec9-4924-b45c-1368fce88a49-kube-api-access-ptg2m\") pod \"managed-serviceaccount-addon-agent-6bb999cf8c-zscrx\" (UID: \"8512020c-1ec9-4924-b45c-1368fce88a49\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.893099 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.893064 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" Apr 16 16:03:52.902873 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:52.902848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:03:53.062881 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:53.062798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx"] Apr 16 16:03:53.067235 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:53.067211 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8512020c_1ec9_4924_b45c_1368fce88a49.slice/crio-fb349abcc49d37e848dd22835eac1e87c7c48ec2fb35ed831b504814b96efd4c WatchSource:0}: Error finding container fb349abcc49d37e848dd22835eac1e87c7c48ec2fb35ed831b504814b96efd4c: Status 404 returned error can't find the container with id fb349abcc49d37e848dd22835eac1e87c7c48ec2fb35ed831b504814b96efd4c Apr 16 16:03:53.076278 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:53.076253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4"] Apr 16 16:03:53.079405 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:03:53.079383 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd759bb16_79f7_47eb_93fd_14305f8a850f.slice/crio-0820c6c1f7db452528b2e7904bb73c3fa781dfbb9edd4d97f96c07b24b94f5ea WatchSource:0}: Error finding container 0820c6c1f7db452528b2e7904bb73c3fa781dfbb9edd4d97f96c07b24b94f5ea: Status 404 returned error can't find the container with id 0820c6c1f7db452528b2e7904bb73c3fa781dfbb9edd4d97f96c07b24b94f5ea Apr 16 16:03:53.288926 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:53.288891 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" event={"ID":"8512020c-1ec9-4924-b45c-1368fce88a49","Type":"ContainerStarted","Data":"fb349abcc49d37e848dd22835eac1e87c7c48ec2fb35ed831b504814b96efd4c"} Apr 16 16:03:53.289688 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:53.289656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" event={"ID":"d759bb16-79f7-47eb-93fd-14305f8a850f","Type":"ContainerStarted","Data":"0820c6c1f7db452528b2e7904bb73c3fa781dfbb9edd4d97f96c07b24b94f5ea"} Apr 16 16:03:57.299538 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:57.299492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" event={"ID":"8512020c-1ec9-4924-b45c-1368fce88a49","Type":"ContainerStarted","Data":"034cb9ba869b23bb7e927ae46613d2be7de90fd19c07b269839179dc9513b9cd"} Apr 16 16:03:57.320413 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:03:57.320361 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" podStartSLOduration=2.153745088 podStartE2EDuration="5.320346625s" podCreationTimestamp="2026-04-16 16:03:52 +0000 UTC" firstStartedPulling="2026-04-16 16:03:53.069893391 +0000 UTC m=+50.584931464" lastFinishedPulling="2026-04-16 16:03:56.236494926 +0000 UTC m=+53.751533001" observedRunningTime="2026-04-16 16:03:57.31928983 +0000 UTC m=+54.834327925" watchObservedRunningTime="2026-04-16 16:03:57.320346625 +0000 UTC m=+54.835384720" Apr 16 16:04:03.254275 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:03.254248 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h6xk8" Apr 16 16:04:07.325704 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:07.325662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" event={"ID":"d759bb16-79f7-47eb-93fd-14305f8a850f","Type":"ContainerStarted","Data":"e9d87a35249a3a7d5151f78b20b9d4a7f51dd6fc503707f9293338c746e1bd75"} Apr 16 16:04:08.730365 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.730326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:04:08.732931 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.732907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:04:08.740898 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:08.740869 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:04:08.741025 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:08.740947 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:05:12.740924934 +0000 UTC m=+130.255963011 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : secret "metrics-daemon-secret" not found Apr 16 16:04:08.831285 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.831239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:04:08.831285 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.831287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:04:08.831497 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.831337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:04:08.831497 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:08.831421 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:04:08.831497 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:08.831462 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:04:08.831497 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:08.831486 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:04:40.831467847 +0000 UTC m=+98.346505920 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:04:08.831673 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:08.831533 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:40.83149867 +0000 UTC m=+98.346536744 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:04:08.834102 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.834084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:04:08.844446 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.844423 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:04:08.855095 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.855068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6cpp\" (UniqueName: \"kubernetes.io/projected/52215440-c220-46dc-927b-72ff3dad940a-kube-api-access-w6cpp\") pod \"network-check-target-5n846\" (UID: \"52215440-c220-46dc-927b-72ff3dad940a\") " pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:04:08.906065 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.906021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qf27n\"" Apr 16 16:04:08.913237 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:08.913207 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:04:09.549979 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:09.549946 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5n846"] Apr 16 16:04:09.553035 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:04:09.553002 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52215440_c220_46dc_927b_72ff3dad940a.slice/crio-b3df79ef8ede88bfd27bcb1cb80575e3cd86b598cb0742585c0e1064b0792385 WatchSource:0}: Error finding container b3df79ef8ede88bfd27bcb1cb80575e3cd86b598cb0742585c0e1064b0792385: Status 404 returned error can't find the container with id b3df79ef8ede88bfd27bcb1cb80575e3cd86b598cb0742585c0e1064b0792385 Apr 16 16:04:10.333857 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:10.333759 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5n846" event={"ID":"52215440-c220-46dc-927b-72ff3dad940a","Type":"ContainerStarted","Data":"b3df79ef8ede88bfd27bcb1cb80575e3cd86b598cb0742585c0e1064b0792385"} Apr 16 16:04:10.336083 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:10.336051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" event={"ID":"d759bb16-79f7-47eb-93fd-14305f8a850f","Type":"ContainerStarted","Data":"65ef498676a2c384738bce47dce65e25c3630e06f89b07dcb4150ade6348941f"} Apr 16 16:04:10.336083 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:10.336088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" event={"ID":"d759bb16-79f7-47eb-93fd-14305f8a850f","Type":"ContainerStarted","Data":"3ed088ef20ffd3b006603542f34379475b233c307175b0dd822dcddc426d30f7"} Apr 16 16:04:10.358376 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:10.358321 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" podStartSLOduration=1.550884441 podStartE2EDuration="18.358305511s" podCreationTimestamp="2026-04-16 16:03:52 +0000 UTC" firstStartedPulling="2026-04-16 16:03:53.081059172 +0000 UTC m=+50.596097247" lastFinishedPulling="2026-04-16 16:04:09.888480232 +0000 UTC m=+67.403518317" observedRunningTime="2026-04-16 16:04:10.357730205 +0000 UTC m=+67.872768323" watchObservedRunningTime="2026-04-16 16:04:10.358305511 +0000 UTC m=+67.873343605" Apr 16 16:04:13.343673 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:13.343635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5n846" event={"ID":"52215440-c220-46dc-927b-72ff3dad940a","Type":"ContainerStarted","Data":"4c05ce738168c49e1801049ff268754db41af89c945564b16f2a31ebe0e38b67"} Apr 16 16:04:13.344044 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:13.343786 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:04:13.360472 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:13.360425 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5n846" podStartSLOduration=67.599934997 podStartE2EDuration="1m10.360410384s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:04:09.554977315 +0000 UTC m=+67.070015389" lastFinishedPulling="2026-04-16 16:04:12.315452698 +0000 UTC m=+69.830490776" observedRunningTime="2026-04-16 16:04:13.359659144 +0000 UTC m=+70.874697240" watchObservedRunningTime="2026-04-16 16:04:13.360410384 +0000 UTC m=+70.875448480" Apr 16 16:04:40.855262 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:40.855110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:04:40.855262 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:40.855161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:04:40.855818 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:40.855266 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:04:40.855818 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:40.855277 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:04:40.855818 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:40.855322 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls podName:b5434c75-d025-4504-969f-7f577a43a937 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:44.855309668 +0000 UTC m=+162.370347742 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls") pod "dns-default-hccnt" (UID: "b5434c75-d025-4504-969f-7f577a43a937") : secret "dns-default-metrics-tls" not found Apr 16 16:04:40.855818 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:04:40.855348 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert podName:efb959fd-8419-441a-a0a0-4d727132d9de nodeName:}" failed. No retries permitted until 2026-04-16 16:05:44.855329885 +0000 UTC m=+162.370367958 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert") pod "ingress-canary-b8qrw" (UID: "efb959fd-8419-441a-a0a0-4d727132d9de") : secret "canary-serving-cert" not found Apr 16 16:04:44.349248 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:04:44.349214 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5n846" Apr 16 16:05:12.779799 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:12.779740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:05:12.780294 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:05:12.779891 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:05:12.780294 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:05:12.779964 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs podName:dd5274ed-46c8-46c2-a74e-26859678b08d nodeName:}" failed. No retries permitted until 2026-04-16 16:07:14.77994821 +0000 UTC m=+252.294986289 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs") pod "network-metrics-daemon-2mqsw" (UID: "dd5274ed-46c8-46c2-a74e-26859678b08d") : secret "metrics-daemon-secret" not found Apr 16 16:05:24.877310 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:24.877281 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-chvcc_a4144120-204d-44b5-9a92-fb19a8e03118/dns-node-resolver/0.log" Apr 16 16:05:25.677984 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:25.677956 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7mgkb_28ab87e7-a174-44d1-a00c-16f49134a9b5/node-ca/0.log" Apr 16 16:05:39.974694 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:05:39.974636 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hccnt" podUID="b5434c75-d025-4504-969f-7f577a43a937" Apr 16 16:05:39.991889 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:05:39.991860 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b8qrw" podUID="efb959fd-8419-441a-a0a0-4d727132d9de" Apr 16 16:05:40.107994 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:05:40.107935 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2mqsw" podUID="dd5274ed-46c8-46c2-a74e-26859678b08d" Apr 16 16:05:40.541897 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:40.541868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hccnt" Apr 16 16:05:44.009237 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.009205 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gbzsb"] Apr 16 16:05:44.012332 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.012314 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.016406 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.016383 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:05:44.017629 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.017598 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:05:44.017629 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.017620 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8s6r8\"" Apr 16 16:05:44.017810 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.017640 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:05:44.017810 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.017671 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:05:44.030052 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.030026 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gbzsb"] Apr 16 16:05:44.106857 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.106828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6886fe81-08eb-4098-9867-798c7f5e4257-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.107017 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.106871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9r6h\" (UniqueName: \"kubernetes.io/projected/6886fe81-08eb-4098-9867-798c7f5e4257-kube-api-access-l9r6h\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.107017 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.106967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6886fe81-08eb-4098-9867-798c7f5e4257-data-volume\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.107017 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.107006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6886fe81-08eb-4098-9867-798c7f5e4257-crio-socket\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.107147 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.107027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6886fe81-08eb-4098-9867-798c7f5e4257-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gbzsb\" (UID: 
\"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.207865 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.207820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6886fe81-08eb-4098-9867-798c7f5e4257-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208057 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.207873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9r6h\" (UniqueName: \"kubernetes.io/projected/6886fe81-08eb-4098-9867-798c7f5e4257-kube-api-access-l9r6h\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208057 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.207929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6886fe81-08eb-4098-9867-798c7f5e4257-data-volume\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208214 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.208183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6886fe81-08eb-4098-9867-798c7f5e4257-crio-socket\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208266 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.208233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6886fe81-08eb-4098-9867-798c7f5e4257-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208317 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.208286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6886fe81-08eb-4098-9867-798c7f5e4257-data-volume\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208317 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.208291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6886fe81-08eb-4098-9867-798c7f5e4257-crio-socket\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.208688 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.208669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6886fe81-08eb-4098-9867-798c7f5e4257-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.210291 ip-10-0-130-130 
kubenswrapper[2577]: I0416 16:05:44.210275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6886fe81-08eb-4098-9867-798c7f5e4257-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.226107 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.226082 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9r6h\" (UniqueName: \"kubernetes.io/projected/6886fe81-08eb-4098-9867-798c7f5e4257-kube-api-access-l9r6h\") pod \"insights-runtime-extractor-gbzsb\" (UID: \"6886fe81-08eb-4098-9867-798c7f5e4257\") " pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.324689 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.324605 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gbzsb" Apr 16 16:05:44.457937 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.457911 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gbzsb"] Apr 16 16:05:44.461783 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:05:44.461748 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6886fe81_08eb_4098_9867_798c7f5e4257.slice/crio-d62461249097c5fcad4480bba514f3fd4114b8d9e1ca648ff2c07fd72b5659e3 WatchSource:0}: Error finding container d62461249097c5fcad4480bba514f3fd4114b8d9e1ca648ff2c07fd72b5659e3: Status 404 returned error can't find the container with id d62461249097c5fcad4480bba514f3fd4114b8d9e1ca648ff2c07fd72b5659e3 Apr 16 16:05:44.551977 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.551943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gbzsb" event={"ID":"6886fe81-08eb-4098-9867-798c7f5e4257","Type":"ContainerStarted","Data":"969a4a397ae1e7f70bdf99004a2607d68d632b93ab8c60b18d441193147a4eeb"} Apr 16 16:05:44.551977 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.551977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gbzsb" event={"ID":"6886fe81-08eb-4098-9867-798c7f5e4257","Type":"ContainerStarted","Data":"d62461249097c5fcad4480bba514f3fd4114b8d9e1ca648ff2c07fd72b5659e3"} Apr 16 16:05:44.913403 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.913364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:05:44.913403 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.913413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" (UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:05:44.915873 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.915841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b5434c75-d025-4504-969f-7f577a43a937-metrics-tls\") pod \"dns-default-hccnt\" 
(UID: \"b5434c75-d025-4504-969f-7f577a43a937\") " pod="openshift-dns/dns-default-hccnt" Apr 16 16:05:44.915997 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:44.915978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efb959fd-8419-441a-a0a0-4d727132d9de-cert\") pod \"ingress-canary-b8qrw\" (UID: \"efb959fd-8419-441a-a0a0-4d727132d9de\") " pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:05:45.045054 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:45.045025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jhn24\"" Apr 16 16:05:45.053097 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:45.053075 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hccnt" Apr 16 16:05:45.188763 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:45.188681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hccnt"] Apr 16 16:05:45.191625 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:05:45.191600 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5434c75_d025_4504_969f_7f577a43a937.slice/crio-048c408a1e33dc6b768d7047f5aa54fdb499309eaaa9997dc3f8cb844fff07bd WatchSource:0}: Error finding container 048c408a1e33dc6b768d7047f5aa54fdb499309eaaa9997dc3f8cb844fff07bd: Status 404 returned error can't find the container with id 048c408a1e33dc6b768d7047f5aa54fdb499309eaaa9997dc3f8cb844fff07bd Apr 16 16:05:45.556436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:45.556402 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gbzsb" event={"ID":"6886fe81-08eb-4098-9867-798c7f5e4257","Type":"ContainerStarted","Data":"5b7a4eff672d704e7857b87641122807d99d3acd1383d574979fe32b1a126cd8"} Apr 16 16:05:45.557330 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:45.557306 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hccnt" event={"ID":"b5434c75-d025-4504-969f-7f577a43a937","Type":"ContainerStarted","Data":"048c408a1e33dc6b768d7047f5aa54fdb499309eaaa9997dc3f8cb844fff07bd"} Apr 16 16:05:47.564400 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:47.564361 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gbzsb" event={"ID":"6886fe81-08eb-4098-9867-798c7f5e4257","Type":"ContainerStarted","Data":"1e76c511ad49057b49f91d17b948047a45f1b2a1c7f14549db27f85c24c9c733"} Apr 16 16:05:47.565952 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:47.565923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hccnt" event={"ID":"b5434c75-d025-4504-969f-7f577a43a937","Type":"ContainerStarted","Data":"e588fecb5a486d9b0ac4fe46e4260bdfbc1ec99a6e63945b01235d5a926f9844"} Apr 16 16:05:47.566055 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:47.565958 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hccnt" event={"ID":"b5434c75-d025-4504-969f-7f577a43a937","Type":"ContainerStarted","Data":"a531794d8caaba0fc5218f300581c83c9050e7f865b592b9701af4a56438c297"} Apr 16 16:05:47.566055 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:47.566039 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hccnt" Apr 16 16:05:47.602793 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:47.602737 
2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hccnt" podStartSLOduration=130.303410205 podStartE2EDuration="2m11.60272253s" podCreationTimestamp="2026-04-16 16:03:36 +0000 UTC" firstStartedPulling="2026-04-16 16:05:45.193343145 +0000 UTC m=+162.708381219" lastFinishedPulling="2026-04-16 16:05:46.492655461 +0000 UTC m=+164.007693544" observedRunningTime="2026-04-16 16:05:47.602453195 +0000 UTC m=+165.117491304" watchObservedRunningTime="2026-04-16 16:05:47.60272253 +0000 UTC m=+165.117760604" Apr 16 16:05:47.603019 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:47.602995 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gbzsb" podStartSLOduration=2.090146027 podStartE2EDuration="4.60298964s" podCreationTimestamp="2026-04-16 16:05:43 +0000 UTC" firstStartedPulling="2026-04-16 16:05:44.512771704 +0000 UTC m=+162.027809777" lastFinishedPulling="2026-04-16 16:05:47.025615312 +0000 UTC m=+164.540653390" observedRunningTime="2026-04-16 16:05:47.583530115 +0000 UTC m=+165.098568202" watchObservedRunningTime="2026-04-16 16:05:47.60298964 +0000 UTC m=+165.118027735" Apr 16 16:05:52.091698 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:52.091661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:05:52.094633 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:52.094614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l2qxk\"" Apr 16 16:05:52.102714 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:52.102698 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8qrw" Apr 16 16:05:52.222430 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:52.222398 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8qrw"] Apr 16 16:05:52.225520 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:05:52.225482 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb959fd_8419_441a_a0a0_4d727132d9de.slice/crio-eb193c1a951add0d5650ad9ddf41f67c195207be9e9d791e8911aa317578dc88 WatchSource:0}: Error finding container eb193c1a951add0d5650ad9ddf41f67c195207be9e9d791e8911aa317578dc88: Status 404 returned error can't find the container with id eb193c1a951add0d5650ad9ddf41f67c195207be9e9d791e8911aa317578dc88 Apr 16 16:05:52.580181 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:52.580141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8qrw" event={"ID":"efb959fd-8419-441a-a0a0-4d727132d9de","Type":"ContainerStarted","Data":"eb193c1a951add0d5650ad9ddf41f67c195207be9e9d791e8911aa317578dc88"} Apr 16 16:05:53.093074 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:53.093041 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:05:54.587588 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:54.587554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8qrw" event={"ID":"efb959fd-8419-441a-a0a0-4d727132d9de","Type":"ContainerStarted","Data":"3381647aa0db0cb76741e4dfdf2f50828d4f9b05075437dda27e46436f4f4b74"} Apr 16 16:05:54.608195 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:54.608146 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b8qrw" podStartSLOduration=137.218056538 podStartE2EDuration="2m18.608133132s" podCreationTimestamp="2026-04-16 16:03:36 +0000 UTC" firstStartedPulling="2026-04-16 16:05:52.227753792 +0000 UTC m=+169.742791869" lastFinishedPulling="2026-04-16 16:05:53.617830386 +0000 UTC m=+171.132868463" observedRunningTime="2026-04-16 16:05:54.60681158 +0000 UTC m=+172.121849674" watchObservedRunningTime="2026-04-16 16:05:54.608133132 +0000 UTC m=+172.123171298" Apr 16 16:05:56.593638 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:56.593605 2577 generic.go:358] "Generic (PLEG): container finished" podID="8512020c-1ec9-4924-b45c-1368fce88a49" containerID="034cb9ba869b23bb7e927ae46613d2be7de90fd19c07b269839179dc9513b9cd" exitCode=255 Apr 16 16:05:56.594038 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:56.593648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" event={"ID":"8512020c-1ec9-4924-b45c-1368fce88a49","Type":"ContainerDied","Data":"034cb9ba869b23bb7e927ae46613d2be7de90fd19c07b269839179dc9513b9cd"} Apr 16 16:05:56.594038 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:56.593924 2577 scope.go:117] "RemoveContainer" containerID="034cb9ba869b23bb7e927ae46613d2be7de90fd19c07b269839179dc9513b9cd" Apr 16 16:05:57.570779 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:57.570747 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hccnt" Apr 16 16:05:57.598105 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:05:57.598075 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6bb999cf8c-zscrx" event={"ID":"8512020c-1ec9-4924-b45c-1368fce88a49","Type":"ContainerStarted","Data":"b58e42690578bff1ddea8d13e8b27ea0dfd7efdd0b3478d43d12e34e4282de7d"} Apr 16 16:06:00.789864 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.789826 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8"] Apr 16 16:06:00.793047 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.793029 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-lc5qk"] Apr 16 16:06:00.793201 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.793181 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.796720 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.796697 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.798666 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.798644 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 16:06:00.798784 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.798644 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:06:00.798921 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.798909 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:06:00.799293 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.799280 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vqwp6\"" Apr 16 16:06:00.799865 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.799842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:06:00.799973 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.799892 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:06:00.799973 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.799897 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:06:00.799973 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.799897 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-lsbnf\"" Apr 16 16:06:00.800412 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.800162 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:06:00.801255 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.801238 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:06:00.810603 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.810581 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8"] Apr 16 16:06:00.821539 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp97c\" (UniqueName: \"kubernetes.io/projected/943564a8-f27b-4d3d-950c-923624bfb085-kube-api-access-fp97c\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.821656 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcps\" (UniqueName: \"kubernetes.io/projected/6f959bf3-102d-4116-8396-d8ae64211287-kube-api-access-4vcps\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.821656 ip-10-0-130-130 kubenswrapper[2577]: I0416 
16:06:00.821590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/943564a8-f27b-4d3d-950c-923624bfb085-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.821656 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/943564a8-f27b-4d3d-950c-923624bfb085-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.821656 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.821864 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.821864 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f959bf3-102d-4116-8396-d8ae64211287-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.821864 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f959bf3-102d-4116-8396-d8ae64211287-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.821864 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f959bf3-102d-4116-8396-d8ae64211287-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.821864 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.821852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" 
(UniqueName: \"kubernetes.io/configmap/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.839754 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.839727 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-lc5qk"] Apr 16 16:06:00.863752 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.863719 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v8g2p"] Apr 16 16:06:00.866809 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.866791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.873235 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.873210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:06:00.873235 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.873230 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:06:00.873435 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.873375 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:06:00.873638 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.873535 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nrtjc\"" Apr 16 16:06:00.922761 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-wtmp\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.922761 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/943564a8-f27b-4d3d-950c-923624bfb085-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.922980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-tls\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.922980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f959bf3-102d-4116-8396-d8ae64211287-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.922980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-sys\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.922980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp97c\" (UniqueName: \"kubernetes.io/projected/943564a8-f27b-4d3d-950c-923624bfb085-kube-api-access-fp97c\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.922980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.922980 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-accelerators-collector-config\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.923238 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.922996 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de882888-cb2e-44b5-b474-0352b65e5a08-metrics-client-ca\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.923238 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcps\" (UniqueName: \"kubernetes.io/projected/6f959bf3-102d-4116-8396-d8ae64211287-kube-api-access-4vcps\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.923238 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.923238 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-root\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") 
" pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.923238 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.923238 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9p8h\" (UniqueName: \"kubernetes.io/projected/de882888-cb2e-44b5-b474-0352b65e5a08-kube-api-access-j9p8h\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.923549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/943564a8-f27b-4d3d-950c-923624bfb085-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.923549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f959bf3-102d-4116-8396-d8ae64211287-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.923549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.923549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f959bf3-102d-4116-8396-d8ae64211287-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.923549 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-textfile\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:00.923754 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/943564a8-f27b-4d3d-950c-923624bfb085-metrics-client-ca\") pod 
\"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.923754 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.923691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/943564a8-f27b-4d3d-950c-923624bfb085-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.924315 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.924293 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.925220 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.925190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f959bf3-102d-4116-8396-d8ae64211287-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.925799 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.925770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.925899 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.925870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f959bf3-102d-4116-8396-d8ae64211287-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.925953 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.925896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/943564a8-f27b-4d3d-950c-923624bfb085-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:00.926661 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.926637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f959bf3-102d-4116-8396-d8ae64211287-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.933611 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.933588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-4vcps\" (UniqueName: \"kubernetes.io/projected/6f959bf3-102d-4116-8396-d8ae64211287-kube-api-access-4vcps\") pod \"openshift-state-metrics-5669946b84-2cdf8\" (UID: \"6f959bf3-102d-4116-8396-d8ae64211287\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:00.935766 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:00.935749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp97c\" (UniqueName: \"kubernetes.io/projected/943564a8-f27b-4d3d-950c-923624bfb085-kube-api-access-fp97c\") pod \"kube-state-metrics-7479c89684-lc5qk\" (UID: \"943564a8-f27b-4d3d-950c-923624bfb085\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:01.024444 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-accelerators-collector-config\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024444 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de882888-cb2e-44b5-b474-0352b65e5a08-metrics-client-ca\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-root\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9p8h\" (UniqueName: \"kubernetes.io/projected/de882888-cb2e-44b5-b474-0352b65e5a08-kube-api-access-j9p8h\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-textfile\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-wtmp\") pod 
\"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-tls\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-root\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.024718 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-sys\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.025080 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:06:01.024757 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:06:01.025080 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-wtmp\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.025080 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.024790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de882888-cb2e-44b5-b474-0352b65e5a08-sys\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.025080 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:06:01.024834 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-tls podName:de882888-cb2e-44b5-b474-0352b65e5a08 nodeName:}" failed. No retries permitted until 2026-04-16 16:06:01.524813101 +0000 UTC m=+179.039851183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-tls") pod "node-exporter-v8g2p" (UID: "de882888-cb2e-44b5-b474-0352b65e5a08") : secret "node-exporter-tls" not found Apr 16 16:06:01.025219 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.025087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-accelerators-collector-config\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.025219 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.025187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de882888-cb2e-44b5-b474-0352b65e5a08-metrics-client-ca\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.025219 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.025185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-textfile\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.027275 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.027253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.035191 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.035169 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9p8h\" (UniqueName: \"kubernetes.io/projected/de882888-cb2e-44b5-b474-0352b65e5a08-kube-api-access-j9p8h\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.102772 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.102691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" Apr 16 16:06:01.107926 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.107904 2577 util.go:30] "No sandbox for pod can be found. 
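Annotation (editor's sketch, not part of the captured journal): the node-exporter-tls mount failure above is retried on a per-volume backoff. The 500ms first delay is taken from the "durationBeforeRetry 500ms" entry; the doubling factor and cap below are assumptions about the kubelet's exponential backoff (nestedpendingoperations), not something this journal shows. In this case a single retry sufficed: the secret was present on the next reconciler pass at 16:06:01.538 and MountVolume.SetUp succeeded at 16:06:01.540.

```python
def backoff_schedule(initial=0.5, factor=2.0, cap=122.0, attempts=8):
    # initial=0.5s matches the logged "durationBeforeRetry 500ms";
    # factor and cap are assumed values for the kubelet's exponential backoff.
    delays, d = [], initial
    for _ in range(attempts):
        delays.append(d)
        d = min(d * factor, cap)
    return delays

print(backoff_schedule())   # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0]
# Here only the first delay was needed before the mount succeeded.
```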
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" Apr 16 16:06:01.251059 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.251024 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8"] Apr 16 16:06:01.255483 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:06:01.255440 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f959bf3_102d_4116_8396_d8ae64211287.slice/crio-4ec14e05931d4b0a8635eefb459589e3bc43e93b62cecd80f3f8d3118f53a1f0 WatchSource:0}: Error finding container 4ec14e05931d4b0a8635eefb459589e3bc43e93b62cecd80f3f8d3118f53a1f0: Status 404 returned error can't find the container with id 4ec14e05931d4b0a8635eefb459589e3bc43e93b62cecd80f3f8d3118f53a1f0 Apr 16 16:06:01.285855 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.285828 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-lc5qk"] Apr 16 16:06:01.289203 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:06:01.289163 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943564a8_f27b_4d3d_950c_923624bfb085.slice/crio-4d6c9ed718df6fd133ba1d81f2f151376818e3ee827a0bdd5a7d8e7fbd42b684 WatchSource:0}: Error finding container 4d6c9ed718df6fd133ba1d81f2f151376818e3ee827a0bdd5a7d8e7fbd42b684: Status 404 returned error can't find the container with id 4d6c9ed718df6fd133ba1d81f2f151376818e3ee827a0bdd5a7d8e7fbd42b684 Apr 16 16:06:01.538269 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.538228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-tls\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.540685 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.540658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/de882888-cb2e-44b5-b474-0352b65e5a08-node-exporter-tls\") pod \"node-exporter-v8g2p\" (UID: \"de882888-cb2e-44b5-b474-0352b65e5a08\") " pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.609411 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.609323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" event={"ID":"943564a8-f27b-4d3d-950c-923624bfb085","Type":"ContainerStarted","Data":"4d6c9ed718df6fd133ba1d81f2f151376818e3ee827a0bdd5a7d8e7fbd42b684"} Apr 16 16:06:01.610687 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.610662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" event={"ID":"6f959bf3-102d-4116-8396-d8ae64211287","Type":"ContainerStarted","Data":"ff2e2db7c55238a4d0993dae5a61b6677eb9478be90373b6987a5f9374d7046e"} Apr 16 16:06:01.610803 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.610694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" event={"ID":"6f959bf3-102d-4116-8396-d8ae64211287","Type":"ContainerStarted","Data":"cdd3ea1801b60557615b98c1817aa7e48515463439cde841416abd6fe1bd718b"} Apr 16 16:06:01.610803 ip-10-0-130-130 kubenswrapper[2577]: I0416 
16:06:01.610707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" event={"ID":"6f959bf3-102d-4116-8396-d8ae64211287","Type":"ContainerStarted","Data":"4ec14e05931d4b0a8635eefb459589e3bc43e93b62cecd80f3f8d3118f53a1f0"} Apr 16 16:06:01.776240 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:01.776202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v8g2p" Apr 16 16:06:01.784734 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:06:01.784709 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde882888_cb2e_44b5_b474_0352b65e5a08.slice/crio-647c30eec6af87a6088626bb8f8b8ed85ffcb3a4366a122caf1fadee26aa72fb WatchSource:0}: Error finding container 647c30eec6af87a6088626bb8f8b8ed85ffcb3a4366a122caf1fadee26aa72fb: Status 404 returned error can't find the container with id 647c30eec6af87a6088626bb8f8b8ed85ffcb3a4366a122caf1fadee26aa72fb Apr 16 16:06:02.620171 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:02.620132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8g2p" event={"ID":"de882888-cb2e-44b5-b474-0352b65e5a08","Type":"ContainerStarted","Data":"647c30eec6af87a6088626bb8f8b8ed85ffcb3a4366a122caf1fadee26aa72fb"} Apr 16 16:06:03.624486 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.624382 2577 generic.go:358] "Generic (PLEG): container finished" podID="de882888-cb2e-44b5-b474-0352b65e5a08" containerID="0c01267c54141c8787c6c9cd0b3153147e70512c44542458d20cd624fb8a758b" exitCode=0 Apr 16 16:06:03.624486 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.624472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8g2p" event={"ID":"de882888-cb2e-44b5-b474-0352b65e5a08","Type":"ContainerDied","Data":"0c01267c54141c8787c6c9cd0b3153147e70512c44542458d20cd624fb8a758b"} Apr 16 16:06:03.626327 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.626299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" event={"ID":"6f959bf3-102d-4116-8396-d8ae64211287","Type":"ContainerStarted","Data":"dd162715b7126b0f945b82f3279759813834523f2187aedd74b20871ad875923"} Apr 16 16:06:03.628184 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.628165 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" event={"ID":"943564a8-f27b-4d3d-950c-923624bfb085","Type":"ContainerStarted","Data":"2a144baf1670f70a4a72314bcfb684921a1f208778a97a2259a3a8bd3f2eaa5d"} Apr 16 16:06:03.628262 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.628189 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" event={"ID":"943564a8-f27b-4d3d-950c-923624bfb085","Type":"ContainerStarted","Data":"0d685277833e6c025c2ed3db90ca295c95da9187424e3dd0a2c28d383ed2b4d2"} Apr 16 16:06:03.628262 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.628198 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" event={"ID":"943564a8-f27b-4d3d-950c-923624bfb085","Type":"ContainerStarted","Data":"e83b3e5230c1a0fc01fa40816138be24507c45478305ecb595cb2b829c0eb649"} Apr 16 16:06:03.667952 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:03.667875 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-2cdf8" podStartSLOduration=2.147439452 podStartE2EDuration="3.667855715s" podCreationTimestamp="2026-04-16 16:06:00 +0000 UTC" firstStartedPulling="2026-04-16 16:06:01.372693943 +0000 UTC m=+178.887732018" lastFinishedPulling="2026-04-16 16:06:02.893110193 +0000 UTC m=+180.408148281" observedRunningTime="2026-04-16 16:06:03.666014792 +0000 UTC m=+181.181052885" watchObservedRunningTime="2026-04-16 16:06:03.667855715 +0000 UTC m=+181.182893812" Apr 16 16:06:04.635106 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:04.635072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8g2p" event={"ID":"de882888-cb2e-44b5-b474-0352b65e5a08","Type":"ContainerStarted","Data":"4aeb8b0e5a3cebd3421a588d88d0d255e8e046175372f55059c07c711cdccfc2"} Apr 16 16:06:04.635567 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:04.635496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v8g2p" event={"ID":"de882888-cb2e-44b5-b474-0352b65e5a08","Type":"ContainerStarted","Data":"4cfec2e86efb91ee9585066f621072a6abacedcf1611ff6183279774e78a9912"} Apr 16 16:06:04.655550 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:04.655481 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-lc5qk" podStartSLOduration=3.053494765 podStartE2EDuration="4.655464848s" podCreationTimestamp="2026-04-16 16:06:00 +0000 UTC" firstStartedPulling="2026-04-16 16:06:01.291137093 +0000 UTC m=+178.806175170" lastFinishedPulling="2026-04-16 16:06:02.893107176 +0000 UTC m=+180.408145253" observedRunningTime="2026-04-16 16:06:03.691396213 +0000 UTC m=+181.206434310" watchObservedRunningTime="2026-04-16 16:06:04.655464848 +0000 UTC m=+182.170502944" Apr 16 16:06:04.655701 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:04.655596 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v8g2p" podStartSLOduration=3.502408478 podStartE2EDuration="4.655591569s" podCreationTimestamp="2026-04-16 16:06:00 +0000 UTC" firstStartedPulling="2026-04-16 16:06:01.786378034 +0000 UTC m=+179.301416108" lastFinishedPulling="2026-04-16 16:06:02.939561121 +0000 UTC m=+180.454599199" observedRunningTime="2026-04-16 16:06:04.654835676 +0000 UTC m=+182.169873765" watchObservedRunningTime="2026-04-16 16:06:04.655591569 +0000 UTC m=+182.170629665" Apr 16 16:06:42.904097 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:42.904037 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" podUID="d759bb16-79f7-47eb-93fd-14305f8a850f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:06:52.904530 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:06:52.904465 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" podUID="d759bb16-79f7-47eb-93fd-14305f8a850f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:07:02.904362 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:02.904327 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" podUID="d759bb16-79f7-47eb-93fd-14305f8a850f" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:07:02.904763 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:02.904406 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" Apr 16 16:07:02.904965 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:02.904935 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"65ef498676a2c384738bce47dce65e25c3630e06f89b07dcb4150ade6348941f"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 16:07:02.905016 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:02.905001 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" podUID="d759bb16-79f7-47eb-93fd-14305f8a850f" containerName="service-proxy" containerID="cri-o://65ef498676a2c384738bce47dce65e25c3630e06f89b07dcb4150ade6348941f" gracePeriod=30 Apr 16 16:07:03.798256 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:03.798222 2577 generic.go:358] "Generic (PLEG): container finished" podID="d759bb16-79f7-47eb-93fd-14305f8a850f" containerID="65ef498676a2c384738bce47dce65e25c3630e06f89b07dcb4150ade6348941f" exitCode=2 Apr 16 16:07:03.798437 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:03.798293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" event={"ID":"d759bb16-79f7-47eb-93fd-14305f8a850f","Type":"ContainerDied","Data":"65ef498676a2c384738bce47dce65e25c3630e06f89b07dcb4150ade6348941f"} Apr 16 16:07:03.798437 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:03.798328 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f544d458-mn4b4" event={"ID":"d759bb16-79f7-47eb-93fd-14305f8a850f","Type":"ContainerStarted","Data":"6ff6f83a63bad48db7ef1f44ec91b196c982827204c1fa1448838916dbd8580d"} Apr 16 16:07:14.792838 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:14.792780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:07:14.795222 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:14.795193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd5274ed-46c8-46c2-a74e-26859678b08d-metrics-certs\") pod \"network-metrics-daemon-2mqsw\" (UID: \"dd5274ed-46c8-46c2-a74e-26859678b08d\") " pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:07:14.996546 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:14.996490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hwq6n\"" Apr 16 16:07:15.004552 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:15.004531 2577 util.go:30] "No sandbox for pod can be found. 
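Annotation (editor's sketch, not part of the captured journal): the three service-proxy "Probe failed" entries above arrive roughly 10 s apart and the restart fires on the third, which is consistent with a liveness probe using the upstream defaults periodSeconds=10 and failureThreshold=3. The pod's actual probe spec is not visible in this journal, so both values are assumptions; the gracePeriod=30 used for the kill is taken from the log.

```python
# "Probe failed" times from the entries above, truncated to milliseconds.
failures = ["16:06:42.904", "16:06:52.904", "16:07:02.904"]

def to_seconds(t):
    h, m, s = t.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

gaps = [round(to_seconds(b) - to_seconds(a), 3) for a, b in zip(failures, failures[1:])]
print(gaps)            # [10.0, 10.0] -> consistent with periodSeconds=10
print(len(failures))   # 3 -> assumed failureThreshold=3 reached; the kubelet kills
                       # service-proxy (gracePeriod=30, from the log) and PLEG reports
                       # the replacement container about a second later.
```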
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mqsw" Apr 16 16:07:15.125595 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:15.125566 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2mqsw"] Apr 16 16:07:15.128940 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:07:15.128911 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5274ed_46c8_46c2_a74e_26859678b08d.slice/crio-ea0b5f6a35fbb783fbe5dfdec0437ad1371d5e93f12f2690e54bb5fece0402ce WatchSource:0}: Error finding container ea0b5f6a35fbb783fbe5dfdec0437ad1371d5e93f12f2690e54bb5fece0402ce: Status 404 returned error can't find the container with id ea0b5f6a35fbb783fbe5dfdec0437ad1371d5e93f12f2690e54bb5fece0402ce Apr 16 16:07:15.832109 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:15.832027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2mqsw" event={"ID":"dd5274ed-46c8-46c2-a74e-26859678b08d","Type":"ContainerStarted","Data":"ea0b5f6a35fbb783fbe5dfdec0437ad1371d5e93f12f2690e54bb5fece0402ce"} Apr 16 16:07:16.838656 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:16.838615 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2mqsw" event={"ID":"dd5274ed-46c8-46c2-a74e-26859678b08d","Type":"ContainerStarted","Data":"9a13409bf2772d6be0fd5da62ec7ac3787a63973fdf3dfe6db87efe892923bd7"} Apr 16 16:07:16.838656 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:16.838661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2mqsw" event={"ID":"dd5274ed-46c8-46c2-a74e-26859678b08d","Type":"ContainerStarted","Data":"6789f26175cfdb15d6b924c3b4b9b1dd18dae9bcc664090d00fc5fa49e7d837b"} Apr 16 16:07:16.857607 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:16.857561 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2mqsw" podStartSLOduration=252.815969274 podStartE2EDuration="4m13.857544353s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:07:15.130825729 +0000 UTC m=+252.645863806" lastFinishedPulling="2026-04-16 16:07:16.172400796 +0000 UTC m=+253.687438885" observedRunningTime="2026-04-16 16:07:16.855988808 +0000 UTC m=+254.371026918" watchObservedRunningTime="2026-04-16 16:07:16.857544353 +0000 UTC m=+254.372582448" Apr 16 16:07:58.384890 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.384857 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m9sr4"] Apr 16 16:07:58.389006 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.388985 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.391441 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.391423 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:07:58.397004 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.396984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m9sr4"] Apr 16 16:07:58.490879 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.490839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b315b40d-dd6d-45a2-aab1-242618785118-dbus\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.490879 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.490882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b315b40d-dd6d-45a2-aab1-242618785118-original-pull-secret\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.491151 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.490905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b315b40d-dd6d-45a2-aab1-242618785118-kubelet-config\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.592180 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.592137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b315b40d-dd6d-45a2-aab1-242618785118-dbus\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.592180 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.592177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b315b40d-dd6d-45a2-aab1-242618785118-original-pull-secret\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.592457 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.592203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b315b40d-dd6d-45a2-aab1-242618785118-kubelet-config\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.592457 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.592283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b315b40d-dd6d-45a2-aab1-242618785118-kubelet-config\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.592457 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.592329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b315b40d-dd6d-45a2-aab1-242618785118-dbus\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.594887 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.594852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b315b40d-dd6d-45a2-aab1-242618785118-original-pull-secret\") pod \"global-pull-secret-syncer-m9sr4\" (UID: \"b315b40d-dd6d-45a2-aab1-242618785118\") " pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.698779 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.698682 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m9sr4" Apr 16 16:07:58.816891 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.816861 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m9sr4"] Apr 16 16:07:58.820543 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:07:58.820493 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb315b40d_dd6d_45a2_aab1_242618785118.slice/crio-672bc4f5d2e99308ff4d9fe132349aa03b949af9d9a6c22750d0fa5d5e25b8c1 WatchSource:0}: Error finding container 672bc4f5d2e99308ff4d9fe132349aa03b949af9d9a6c22750d0fa5d5e25b8c1: Status 404 returned error can't find the container with id 672bc4f5d2e99308ff4d9fe132349aa03b949af9d9a6c22750d0fa5d5e25b8c1 Apr 16 16:07:58.950443 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:07:58.950351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m9sr4" event={"ID":"b315b40d-dd6d-45a2-aab1-242618785118","Type":"ContainerStarted","Data":"672bc4f5d2e99308ff4d9fe132349aa03b949af9d9a6c22750d0fa5d5e25b8c1"} Apr 16 16:08:02.973384 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:08:02.973351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:08:02.973971 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:08:02.973949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:08:02.978033 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:08:02.978011 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:08:03.972362 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:08:03.972328 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m9sr4" event={"ID":"b315b40d-dd6d-45a2-aab1-242618785118","Type":"ContainerStarted","Data":"12cb9741003166a9cd5ac17cf81bdffdde19469906f116554d672276772c4c93"} Apr 16 16:08:03.987380 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:08:03.987328 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m9sr4" podStartSLOduration=1.925130359 podStartE2EDuration="5.98731259s" podCreationTimestamp="2026-04-16 16:07:58 +0000 UTC" firstStartedPulling="2026-04-16 16:07:58.822262472 +0000 UTC m=+296.337300545" lastFinishedPulling="2026-04-16 16:08:02.884444698 +0000 UTC m=+300.399482776" observedRunningTime="2026-04-16 16:08:03.986937431 +0000 UTC m=+301.501975525" watchObservedRunningTime="2026-04-16 16:08:03.98731259 +0000 
UTC m=+301.502350685" Apr 16 16:09:45.875388 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.875350 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-lb57q"] Apr 16 16:09:45.878141 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.878123 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:45.880748 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.880723 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 16:09:45.880863 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.880820 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 16:09:45.880950 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.880936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 16:09:45.881684 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.881665 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 16:09:45.881761 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.881743 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-trhqz\"" Apr 16 16:09:45.887367 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.887332 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lb57q"] Apr 16 16:09:45.928278 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.928241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4g2\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-kube-api-access-8l4g2\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:45.928447 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:45.928292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-certificates\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.029187 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.029150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4g2\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-kube-api-access-8l4g2\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.029391 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.029199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-certificates\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.029391 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:09:46.029293 2577 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: 
secret "keda-admission-webhooks-certs" not found Apr 16 16:09:46.029391 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:09:46.029311 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-lb57q: secret "keda-admission-webhooks-certs" not found Apr 16 16:09:46.029391 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:09:46.029371 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-certificates podName:68b295f8-9430-4ac5-bfde-76a63fb4588f nodeName:}" failed. No retries permitted until 2026-04-16 16:09:46.529354589 +0000 UTC m=+404.044392662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-certificates") pod "keda-admission-cf49989db-lb57q" (UID: "68b295f8-9430-4ac5-bfde-76a63fb4588f") : secret "keda-admission-webhooks-certs" not found Apr 16 16:09:46.045295 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.045262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4g2\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-kube-api-access-8l4g2\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.533949 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.533907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-certificates\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.536528 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.536491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/68b295f8-9430-4ac5-bfde-76a63fb4588f-certificates\") pod \"keda-admission-cf49989db-lb57q\" (UID: \"68b295f8-9430-4ac5-bfde-76a63fb4588f\") " pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.788823 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.788698 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:46.911463 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.911423 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lb57q"] Apr 16 16:09:46.915110 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:09:46.915081 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68b295f8_9430_4ac5_bfde_76a63fb4588f.slice/crio-191907f4cb8925d7a9e26a0a0fccc60f7eed43d9f87a93f9f11b9119a8f21bd1 WatchSource:0}: Error finding container 191907f4cb8925d7a9e26a0a0fccc60f7eed43d9f87a93f9f11b9119a8f21bd1: Status 404 returned error can't find the container with id 191907f4cb8925d7a9e26a0a0fccc60f7eed43d9f87a93f9f11b9119a8f21bd1 Apr 16 16:09:46.916322 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:46.916305 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:09:47.248568 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:47.248489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lb57q" event={"ID":"68b295f8-9430-4ac5-bfde-76a63fb4588f","Type":"ContainerStarted","Data":"191907f4cb8925d7a9e26a0a0fccc60f7eed43d9f87a93f9f11b9119a8f21bd1"} Apr 16 16:09:49.256244 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:49.256147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lb57q" event={"ID":"68b295f8-9430-4ac5-bfde-76a63fb4588f","Type":"ContainerStarted","Data":"fed289504098bb7d11f807d10ca3a6d9bdb3ceb3cce9568d9527e73ce12758d3"} Apr 16 16:09:49.256244 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:49.256217 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:09:49.275550 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:09:49.275481 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-lb57q" podStartSLOduration=2.219403263 podStartE2EDuration="4.275465113s" podCreationTimestamp="2026-04-16 16:09:45 +0000 UTC" firstStartedPulling="2026-04-16 16:09:46.916427258 +0000 UTC m=+404.431465332" lastFinishedPulling="2026-04-16 16:09:48.972489108 +0000 UTC m=+406.487527182" observedRunningTime="2026-04-16 16:09:49.274354243 +0000 UTC m=+406.789392341" watchObservedRunningTime="2026-04-16 16:09:49.275465113 +0000 UTC m=+406.790503209" Apr 16 16:10:10.260914 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:10.260887 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-lb57q" Apr 16 16:10:51.192826 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.192711 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-kj22m"] Apr 16 16:10:51.199165 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.195170 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.199949 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.199918 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-qhttj\"" Apr 16 16:10:51.200592 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.200570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:10:51.200729 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.200570 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 16:10:51.200807 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.200597 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:10:51.204379 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.204355 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-kj22m"] Apr 16 16:10:51.221121 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.221096 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-mzfnp"] Apr 16 16:10:51.223214 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.223192 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.225467 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.225449 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:10:51.225635 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.225618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-jdvqm\"" Apr 16 16:10:51.234566 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.234546 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mzfnp"] Apr 16 16:10:51.289193 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.289161 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad9ba298-ec3a-4e1d-b409-0848a1cad1ed-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kj22m\" (UID: \"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.289363 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.289213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdzkv\" (UniqueName: \"kubernetes.io/projected/ad9ba298-ec3a-4e1d-b409-0848a1cad1ed-kube-api-access-bdzkv\") pod \"llmisvc-controller-manager-68cc5db7c4-kj22m\" (UID: \"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.289363 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.289286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbqff\" (UniqueName: \"kubernetes.io/projected/2587e677-476b-40cf-b78f-23ac06bc584b-kube-api-access-pbqff\") pod \"seaweedfs-86cc847c5c-mzfnp\" (UID: \"2587e677-476b-40cf-b78f-23ac06bc584b\") " pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.289363 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.289349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2587e677-476b-40cf-b78f-23ac06bc584b-data\") pod \"seaweedfs-86cc847c5c-mzfnp\" (UID: \"2587e677-476b-40cf-b78f-23ac06bc584b\") " pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.389779 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.389742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdzkv\" (UniqueName: \"kubernetes.io/projected/ad9ba298-ec3a-4e1d-b409-0848a1cad1ed-kube-api-access-bdzkv\") pod \"llmisvc-controller-manager-68cc5db7c4-kj22m\" (UID: \"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.389985 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.389797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbqff\" (UniqueName: \"kubernetes.io/projected/2587e677-476b-40cf-b78f-23ac06bc584b-kube-api-access-pbqff\") pod \"seaweedfs-86cc847c5c-mzfnp\" (UID: \"2587e677-476b-40cf-b78f-23ac06bc584b\") " pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.389985 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.389824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2587e677-476b-40cf-b78f-23ac06bc584b-data\") pod \"seaweedfs-86cc847c5c-mzfnp\" (UID: \"2587e677-476b-40cf-b78f-23ac06bc584b\") " pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.389985 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.389854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad9ba298-ec3a-4e1d-b409-0848a1cad1ed-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kj22m\" (UID: \"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.390219 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.390199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2587e677-476b-40cf-b78f-23ac06bc584b-data\") pod \"seaweedfs-86cc847c5c-mzfnp\" (UID: \"2587e677-476b-40cf-b78f-23ac06bc584b\") " pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.392455 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.392433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad9ba298-ec3a-4e1d-b409-0848a1cad1ed-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kj22m\" (UID: \"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.397550 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.397524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdzkv\" (UniqueName: \"kubernetes.io/projected/ad9ba298-ec3a-4e1d-b409-0848a1cad1ed-kube-api-access-bdzkv\") pod \"llmisvc-controller-manager-68cc5db7c4-kj22m\" (UID: \"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.397674 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.397649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbqff\" (UniqueName: \"kubernetes.io/projected/2587e677-476b-40cf-b78f-23ac06bc584b-kube-api-access-pbqff\") pod \"seaweedfs-86cc847c5c-mzfnp\" (UID: \"2587e677-476b-40cf-b78f-23ac06bc584b\") " pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.509664 ip-10-0-130-130 
kubenswrapper[2577]: I0416 16:10:51.509559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:51.532339 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.532308 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:51.642147 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.642056 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-kj22m"] Apr 16 16:10:51.644926 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:10:51.644882 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podad9ba298_ec3a_4e1d_b409_0848a1cad1ed.slice/crio-1a979507fc397e8c43d3474af5d40a9c6f0d97f5fcfa275713b96127047e4bef WatchSource:0}: Error finding container 1a979507fc397e8c43d3474af5d40a9c6f0d97f5fcfa275713b96127047e4bef: Status 404 returned error can't find the container with id 1a979507fc397e8c43d3474af5d40a9c6f0d97f5fcfa275713b96127047e4bef Apr 16 16:10:51.661056 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:51.661028 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mzfnp"] Apr 16 16:10:51.663689 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:10:51.663646 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2587e677_476b_40cf_b78f_23ac06bc584b.slice/crio-4c730da30e8c2a36fe1169eb440c8fe0a32266e5b3dd8c3ae5c012936bb519a8 WatchSource:0}: Error finding container 4c730da30e8c2a36fe1169eb440c8fe0a32266e5b3dd8c3ae5c012936bb519a8: Status 404 returned error can't find the container with id 4c730da30e8c2a36fe1169eb440c8fe0a32266e5b3dd8c3ae5c012936bb519a8 Apr 16 16:10:52.427344 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:52.427305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mzfnp" event={"ID":"2587e677-476b-40cf-b78f-23ac06bc584b","Type":"ContainerStarted","Data":"4c730da30e8c2a36fe1169eb440c8fe0a32266e5b3dd8c3ae5c012936bb519a8"} Apr 16 16:10:52.428862 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:52.428812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" event={"ID":"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed","Type":"ContainerStarted","Data":"1a979507fc397e8c43d3474af5d40a9c6f0d97f5fcfa275713b96127047e4bef"} Apr 16 16:10:55.438614 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:55.438574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mzfnp" event={"ID":"2587e677-476b-40cf-b78f-23ac06bc584b","Type":"ContainerStarted","Data":"ad6e92bfb1680336bfc6522baf922ebee953278922f0c2ef7fe7d861002657c0"} Apr 16 16:10:55.439030 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:55.438683 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:10:55.440048 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:55.440021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" event={"ID":"ad9ba298-ec3a-4e1d-b409-0848a1cad1ed","Type":"ContainerStarted","Data":"79ca0270cec007e887e8690030d372cf7da41fe33f8432e7666555a6d02bdd3a"} Apr 16 16:10:55.440151 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:55.440107 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:10:55.455339 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:55.455290 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-mzfnp" podStartSLOduration=1.245266822 podStartE2EDuration="4.455275895s" podCreationTimestamp="2026-04-16 16:10:51 +0000 UTC" firstStartedPulling="2026-04-16 16:10:51.665021494 +0000 UTC m=+469.180059568" lastFinishedPulling="2026-04-16 16:10:54.875030566 +0000 UTC m=+472.390068641" observedRunningTime="2026-04-16 16:10:55.453609195 +0000 UTC m=+472.968647292" watchObservedRunningTime="2026-04-16 16:10:55.455275895 +0000 UTC m=+472.970313990" Apr 16 16:10:55.470911 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:10:55.470865 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" podStartSLOduration=1.296336648 podStartE2EDuration="4.470853055s" podCreationTimestamp="2026-04-16 16:10:51 +0000 UTC" firstStartedPulling="2026-04-16 16:10:51.646160787 +0000 UTC m=+469.161198863" lastFinishedPulling="2026-04-16 16:10:54.820677194 +0000 UTC m=+472.335715270" observedRunningTime="2026-04-16 16:10:55.46974414 +0000 UTC m=+472.984782241" watchObservedRunningTime="2026-04-16 16:10:55.470853055 +0000 UTC m=+472.985891150" Apr 16 16:11:01.445382 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:11:01.445348 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-mzfnp" Apr 16 16:11:26.445771 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:11:26.445741 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kj22m" Apr 16 16:12:00.657234 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.657201 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6c9kq"] Apr 16 16:12:00.659661 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.659636 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:00.662321 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.662297 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-vtbvk\"" Apr 16 16:12:00.662436 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.662334 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 16:12:00.669130 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.669107 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6c9kq"] Apr 16 16:12:00.678044 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.678017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c965d9c-d49d-4edc-977c-7912744b335d-cert\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:00.678147 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.678118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jzq\" (UniqueName: \"kubernetes.io/projected/9c965d9c-d49d-4edc-977c-7912744b335d-kube-api-access-w4jzq\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:00.778480 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.778444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4jzq\" (UniqueName: \"kubernetes.io/projected/9c965d9c-d49d-4edc-977c-7912744b335d-kube-api-access-w4jzq\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:00.778480 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.778486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c965d9c-d49d-4edc-977c-7912744b335d-cert\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:00.778761 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:12:00.778633 2577 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 16:12:00.778761 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:12:00.778694 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c965d9c-d49d-4edc-977c-7912744b335d-cert podName:9c965d9c-d49d-4edc-977c-7912744b335d nodeName:}" failed. No retries permitted until 2026-04-16 16:12:01.278674696 +0000 UTC m=+538.793712770 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c965d9c-d49d-4edc-977c-7912744b335d-cert") pod "odh-model-controller-696fc77849-6c9kq" (UID: "9c965d9c-d49d-4edc-977c-7912744b335d") : secret "odh-model-controller-webhook-cert" not found Apr 16 16:12:00.787909 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:00.787882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4jzq\" (UniqueName: \"kubernetes.io/projected/9c965d9c-d49d-4edc-977c-7912744b335d-kube-api-access-w4jzq\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:01.281426 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:01.281392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c965d9c-d49d-4edc-977c-7912744b335d-cert\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:01.283875 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:01.283853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c965d9c-d49d-4edc-977c-7912744b335d-cert\") pod \"odh-model-controller-696fc77849-6c9kq\" (UID: \"9c965d9c-d49d-4edc-977c-7912744b335d\") " pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:01.570161 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:01.570068 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:01.691812 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:01.691784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6c9kq"] Apr 16 16:12:01.694302 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:12:01.694272 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c965d9c_d49d_4edc_977c_7912744b335d.slice/crio-a96c9168c776f743ada5ce69e377041404bb9c8209259b6dfb78b16e43a1bb61 WatchSource:0}: Error finding container a96c9168c776f743ada5ce69e377041404bb9c8209259b6dfb78b16e43a1bb61: Status 404 returned error can't find the container with id a96c9168c776f743ada5ce69e377041404bb9c8209259b6dfb78b16e43a1bb61 Apr 16 16:12:02.626636 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:02.626601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6c9kq" event={"ID":"9c965d9c-d49d-4edc-977c-7912744b335d","Type":"ContainerStarted","Data":"a96c9168c776f743ada5ce69e377041404bb9c8209259b6dfb78b16e43a1bb61"} Apr 16 16:12:04.636213 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:04.636182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6c9kq" event={"ID":"9c965d9c-d49d-4edc-977c-7912744b335d","Type":"ContainerStarted","Data":"61a97c8c29aa5515bc6ad56dce47ede686b6e75053bad60a8777147968de1b28"} Apr 16 16:12:04.636589 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:04.636337 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:04.654146 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:04.654091 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6c9kq" 
podStartSLOduration=2.005604306 podStartE2EDuration="4.654070587s" podCreationTimestamp="2026-04-16 16:12:00 +0000 UTC" firstStartedPulling="2026-04-16 16:12:01.695661115 +0000 UTC m=+539.210699190" lastFinishedPulling="2026-04-16 16:12:04.344127395 +0000 UTC m=+541.859165471" observedRunningTime="2026-04-16 16:12:04.652000149 +0000 UTC m=+542.167038244" watchObservedRunningTime="2026-04-16 16:12:04.654070587 +0000 UTC m=+542.169108685" Apr 16 16:12:15.641145 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:15.641109 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6c9kq" Apr 16 16:12:27.911529 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:27.911484 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx"] Apr 16 16:12:27.914027 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:27.914004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:27.917466 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:27.917444 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 16:12:27.956653 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:27.956624 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx"] Apr 16 16:12:27.994601 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:27.994565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-wmdcx\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:27.994757 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:27.994649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xkd\" (UniqueName: \"kubernetes.io/projected/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-kube-api-access-w2xkd\") pod \"seaweedfs-tls-custom-ddd4dbfd-wmdcx\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:28.095538 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.095472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xkd\" (UniqueName: \"kubernetes.io/projected/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-kube-api-access-w2xkd\") pod \"seaweedfs-tls-custom-ddd4dbfd-wmdcx\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:28.095736 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.095550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-wmdcx\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:28.095939 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.095916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-wmdcx\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:28.103944 
ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.103922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xkd\" (UniqueName: \"kubernetes.io/projected/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-kube-api-access-w2xkd\") pod \"seaweedfs-tls-custom-ddd4dbfd-wmdcx\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:28.223538 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.223432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:28.342196 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.342163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx"] Apr 16 16:12:28.345571 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:12:28.345535 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2edb7a_3fb5_494d_a137_f8c9b0fef84d.slice/crio-827dcf950130e5685c7384039efdd04d2287cbc49df005622d0032ef3aed2527 WatchSource:0}: Error finding container 827dcf950130e5685c7384039efdd04d2287cbc49df005622d0032ef3aed2527: Status 404 returned error can't find the container with id 827dcf950130e5685c7384039efdd04d2287cbc49df005622d0032ef3aed2527 Apr 16 16:12:28.701747 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:28.701712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" event={"ID":"db2edb7a-3fb5-494d-a137-f8c9b0fef84d","Type":"ContainerStarted","Data":"827dcf950130e5685c7384039efdd04d2287cbc49df005622d0032ef3aed2527"} Apr 16 16:12:29.705934 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:29.705894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" event={"ID":"db2edb7a-3fb5-494d-a137-f8c9b0fef84d","Type":"ContainerStarted","Data":"5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7"} Apr 16 16:12:29.722693 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:29.722535 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" podStartSLOduration=2.419920275 podStartE2EDuration="2.722497594s" podCreationTimestamp="2026-04-16 16:12:27 +0000 UTC" firstStartedPulling="2026-04-16 16:12:28.346776369 +0000 UTC m=+565.861814443" lastFinishedPulling="2026-04-16 16:12:28.649353688 +0000 UTC m=+566.164391762" observedRunningTime="2026-04-16 16:12:29.72137611 +0000 UTC m=+567.236414206" watchObservedRunningTime="2026-04-16 16:12:29.722497594 +0000 UTC m=+567.237535697" Apr 16 16:12:30.511122 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:30.511086 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx"] Apr 16 16:12:31.711304 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:31.711251 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" podUID="db2edb7a-3fb5-494d-a137-f8c9b0fef84d" containerName="seaweedfs-tls-custom" containerID="cri-o://5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7" gracePeriod=30 Apr 16 16:12:32.946645 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:32.946621 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:33.035438 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.035337 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-data\") pod \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " Apr 16 16:12:33.035438 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.035417 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xkd\" (UniqueName: \"kubernetes.io/projected/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-kube-api-access-w2xkd\") pod \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\" (UID: \"db2edb7a-3fb5-494d-a137-f8c9b0fef84d\") " Apr 16 16:12:33.036635 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.036609 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-data" (OuterVolumeSpecName: "data") pod "db2edb7a-3fb5-494d-a137-f8c9b0fef84d" (UID: "db2edb7a-3fb5-494d-a137-f8c9b0fef84d"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:12:33.037763 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.037738 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-kube-api-access-w2xkd" (OuterVolumeSpecName: "kube-api-access-w2xkd") pod "db2edb7a-3fb5-494d-a137-f8c9b0fef84d" (UID: "db2edb7a-3fb5-494d-a137-f8c9b0fef84d"). InnerVolumeSpecName "kube-api-access-w2xkd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:12:33.136218 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.136184 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2xkd\" (UniqueName: \"kubernetes.io/projected/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-kube-api-access-w2xkd\") on node \"ip-10-0-130-130.ec2.internal\" DevicePath \"\"" Apr 16 16:12:33.136218 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.136216 2577 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/db2edb7a-3fb5-494d-a137-f8c9b0fef84d-data\") on node \"ip-10-0-130-130.ec2.internal\" DevicePath \"\"" Apr 16 16:12:33.718632 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.718595 2577 generic.go:358] "Generic (PLEG): container finished" podID="db2edb7a-3fb5-494d-a137-f8c9b0fef84d" containerID="5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7" exitCode=0 Apr 16 16:12:33.718831 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.718659 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" Apr 16 16:12:33.718831 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.718680 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" event={"ID":"db2edb7a-3fb5-494d-a137-f8c9b0fef84d","Type":"ContainerDied","Data":"5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7"} Apr 16 16:12:33.718831 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.718720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx" event={"ID":"db2edb7a-3fb5-494d-a137-f8c9b0fef84d","Type":"ContainerDied","Data":"827dcf950130e5685c7384039efdd04d2287cbc49df005622d0032ef3aed2527"} Apr 16 16:12:33.718831 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.718735 2577 scope.go:117] "RemoveContainer" containerID="5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7" Apr 16 16:12:33.727331 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.727313 2577 scope.go:117] "RemoveContainer" containerID="5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7" Apr 16 16:12:33.727664 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:12:33.727641 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7\": container with ID starting with 5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7 not found: ID does not exist" containerID="5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7" Apr 16 16:12:33.727725 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.727673 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7"} err="failed to get container status \"5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7\": rpc error: code = NotFound desc = could not find container \"5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7\": container with ID starting with 5b28b917c0df3494c2b473ed5a20e3344172901bc0dcfa71960b6db39f270ba7 not found: ID does not exist" Apr 16 16:12:33.732586 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.732564 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx"] Apr 16 16:12:33.734932 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.734913 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-wmdcx"] Apr 16 16:12:33.760011 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.759986 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-p954m"] Apr 16 16:12:33.760266 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.760254 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db2edb7a-3fb5-494d-a137-f8c9b0fef84d" containerName="seaweedfs-tls-custom" Apr 16 16:12:33.760310 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.760268 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2edb7a-3fb5-494d-a137-f8c9b0fef84d" containerName="seaweedfs-tls-custom" Apr 16 16:12:33.760347 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.760312 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="db2edb7a-3fb5-494d-a137-f8c9b0fef84d" containerName="seaweedfs-tls-custom" Apr 16 16:12:33.763155 ip-10-0-130-130 kubenswrapper[2577]: I0416 
16:12:33.763137 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.765299 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.765285 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 16 16:12:33.765368 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.765286 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 16:12:33.770176 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.770094 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-p954m"] Apr 16 16:12:33.841424 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.841385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/87ed86d3-443d-4db0-a511-f793db537a82-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.841632 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.841438 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/87ed86d3-443d-4db0-a511-f793db537a82-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.841632 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.841546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmmh\" (UniqueName: \"kubernetes.io/projected/87ed86d3-443d-4db0-a511-f793db537a82-kube-api-access-lfmmh\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.942725 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.942684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/87ed86d3-443d-4db0-a511-f793db537a82-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.942888 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.942766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/87ed86d3-443d-4db0-a511-f793db537a82-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.942888 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.942813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmmh\" (UniqueName: \"kubernetes.io/projected/87ed86d3-443d-4db0-a511-f793db537a82-kube-api-access-lfmmh\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.943092 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.943073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/87ed86d3-443d-4db0-a511-f793db537a82-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.945295 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.945268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/87ed86d3-443d-4db0-a511-f793db537a82-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:33.951106 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:33.951086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmmh\" (UniqueName: \"kubernetes.io/projected/87ed86d3-443d-4db0-a511-f793db537a82-kube-api-access-lfmmh\") pod \"seaweedfs-tls-custom-5c88b85bb7-p954m\" (UID: \"87ed86d3-443d-4db0-a511-f793db537a82\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:34.073040 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:34.072960 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" Apr 16 16:12:34.202015 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:34.201982 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-p954m"] Apr 16 16:12:34.205314 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:12:34.205286 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ed86d3_443d_4db0_a511_f793db537a82.slice/crio-8b6b489057f27b4b2ac65aa82eb80638036c3a86236e2072d8a7fbaa19e58731 WatchSource:0}: Error finding container 8b6b489057f27b4b2ac65aa82eb80638036c3a86236e2072d8a7fbaa19e58731: Status 404 returned error can't find the container with id 8b6b489057f27b4b2ac65aa82eb80638036c3a86236e2072d8a7fbaa19e58731 Apr 16 16:12:34.722731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:34.722642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" event={"ID":"87ed86d3-443d-4db0-a511-f793db537a82","Type":"ContainerStarted","Data":"2d8b0ffd60780fcd2a0f2b9c32327c7df1b58bd01b2ba15f06f04f352f7ff2b5"} Apr 16 16:12:34.722731 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:34.722681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" event={"ID":"87ed86d3-443d-4db0-a511-f793db537a82","Type":"ContainerStarted","Data":"8b6b489057f27b4b2ac65aa82eb80638036c3a86236e2072d8a7fbaa19e58731"} Apr 16 16:12:34.738196 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:34.738141 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-p954m" podStartSLOduration=1.471769719 podStartE2EDuration="1.738126202s" podCreationTimestamp="2026-04-16 16:12:33 +0000 UTC" firstStartedPulling="2026-04-16 16:12:34.206610198 +0000 UTC m=+571.721648272" lastFinishedPulling="2026-04-16 16:12:34.472966681 +0000 UTC m=+571.988004755" observedRunningTime="2026-04-16 16:12:34.736862404 +0000 UTC m=+572.251900500" watchObservedRunningTime="2026-04-16 16:12:34.738126202 +0000 UTC m=+572.253164298" Apr 16 16:12:35.095455 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:12:35.095369 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2edb7a-3fb5-494d-a137-f8c9b0fef84d" 
path="/var/lib/kubelet/pods/db2edb7a-3fb5-494d-a137-f8c9b0fef84d/volumes" Apr 16 16:13:02.998706 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:13:02.998678 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:13:02.999327 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:13:02.999306 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:16:05.229016 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.228984 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq"] Apr 16 16:16:05.231207 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.231184 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" Apr 16 16:16:05.233711 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.233691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzmqv\"" Apr 16 16:16:05.241014 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.240988 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq"] Apr 16 16:16:05.241198 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.241184 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" Apr 16 16:16:05.365369 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.365332 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq"] Apr 16 16:16:05.369454 ip-10-0-130-130 kubenswrapper[2577]: W0416 16:16:05.369427 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09954f09_ba99_4aeb_9317_183fb22fa929.slice/crio-99458fabdde981ecf4ad59e765aed628b5b9d1fd76fd86c5154f4763b03cf5e6 WatchSource:0}: Error finding container 99458fabdde981ecf4ad59e765aed628b5b9d1fd76fd86c5154f4763b03cf5e6: Status 404 returned error can't find the container with id 99458fabdde981ecf4ad59e765aed628b5b9d1fd76fd86c5154f4763b03cf5e6 Apr 16 16:16:05.371273 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:05.371257 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:16:06.316613 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:06.316571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" event={"ID":"09954f09-ba99-4aeb-9317-183fb22fa929","Type":"ContainerStarted","Data":"99458fabdde981ecf4ad59e765aed628b5b9d1fd76fd86c5154f4763b03cf5e6"} Apr 16 16:16:07.320546 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:07.320489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" event={"ID":"09954f09-ba99-4aeb-9317-183fb22fa929","Type":"ContainerStarted","Data":"ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26"} Apr 16 16:16:07.320935 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:07.320740 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" Apr 16 16:16:07.322394 
ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:07.322375 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" Apr 16 16:16:07.339754 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:16:07.339704 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" podStartSLOduration=1.3100818269999999 podStartE2EDuration="2.339688084s" podCreationTimestamp="2026-04-16 16:16:05 +0000 UTC" firstStartedPulling="2026-04-16 16:16:05.371384817 +0000 UTC m=+782.886422891" lastFinishedPulling="2026-04-16 16:16:06.40099106 +0000 UTC m=+783.916029148" observedRunningTime="2026-04-16 16:16:07.338331715 +0000 UTC m=+784.853369811" watchObservedRunningTime="2026-04-16 16:16:07.339688084 +0000 UTC m=+784.854726182" Apr 16 16:17:40.306522 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:40.306484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-79c7995f46-j6nkq_09954f09-ba99-4aeb-9317-183fb22fa929/kserve-container/0.log" Apr 16 16:17:40.592522 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:40.592421 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq"] Apr 16 16:17:40.592697 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:40.592672 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" podUID="09954f09-ba99-4aeb-9317-183fb22fa929" containerName="kserve-container" containerID="cri-o://ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26" gracePeriod=30 Apr 16 16:17:40.829412 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:40.829387 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" Apr 16 16:17:41.587757 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.587723 2577 generic.go:358] "Generic (PLEG): container finished" podID="09954f09-ba99-4aeb-9317-183fb22fa929" containerID="ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26" exitCode=2 Apr 16 16:17:41.588184 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.587781 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" Apr 16 16:17:41.588184 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.587816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" event={"ID":"09954f09-ba99-4aeb-9317-183fb22fa929","Type":"ContainerDied","Data":"ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26"} Apr 16 16:17:41.588184 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.587855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq" event={"ID":"09954f09-ba99-4aeb-9317-183fb22fa929","Type":"ContainerDied","Data":"99458fabdde981ecf4ad59e765aed628b5b9d1fd76fd86c5154f4763b03cf5e6"} Apr 16 16:17:41.588184 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.587871 2577 scope.go:117] "RemoveContainer" containerID="ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26" Apr 16 16:17:41.595984 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.595800 2577 scope.go:117] "RemoveContainer" containerID="ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26" Apr 16 16:17:41.596063 ip-10-0-130-130 kubenswrapper[2577]: E0416 16:17:41.596044 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26\": container with ID starting with ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26 not found: ID does not exist" containerID="ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26" Apr 16 16:17:41.596265 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.596072 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26"} err="failed to get container status \"ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26\": rpc error: code = NotFound desc = could not find container \"ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26\": container with ID starting with ee5ad5a54d3107c6f9a7f195fe906334c4ad39976e7452669ed8f08a5dc11e26 not found: ID does not exist" Apr 16 16:17:41.602165 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.601979 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq"] Apr 16 16:17:41.603550 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:41.603532 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-79c7995f46-j6nkq"] Apr 16 16:17:43.095026 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:17:43.094992 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09954f09-ba99-4aeb-9317-183fb22fa929" path="/var/lib/kubelet/pods/09954f09-ba99-4aeb-9317-183fb22fa929/volumes" Apr 16 16:18:03.020562 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:18:03.020534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:18:03.022020 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:18:03.021999 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:23:03.039263 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:23:03.039233 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:23:03.041324 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:23:03.041299 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:28:03.057920 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:28:03.057895 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:28:03.060298 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:28:03.060276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:33:03.077676 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:33:03.077642 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:33:03.079666 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:33:03.079638 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:38:03.097447 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:38:03.097420 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:38:03.100119 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:38:03.100086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:43:03.117944 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:43:03.117912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:43:03.120989 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:43:03.120962 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:48:03.141274 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:48:03.141246 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:48:03.149074 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:48:03.149051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:53:03.162555 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:53:03.162493 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:53:03.174947 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:53:03.174920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:58:03.179666 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:58:03.179640 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 16:58:03.198934 ip-10-0-130-130 kubenswrapper[2577]: I0416 16:58:03.198908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 17:03:03.198357 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:03:03.198328 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 17:03:03.217029 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:03:03.217005 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 17:05:10.474365 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.474327 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7k2bn/must-gather-pxb6g"] Apr 16 17:05:10.474849 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.474631 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09954f09-ba99-4aeb-9317-183fb22fa929" containerName="kserve-container" Apr 16 17:05:10.474849 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.474642 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="09954f09-ba99-4aeb-9317-183fb22fa929" containerName="kserve-container" Apr 16 17:05:10.474849 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.474699 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="09954f09-ba99-4aeb-9317-183fb22fa929" containerName="kserve-container" Apr 16 17:05:10.477523 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.477486 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.480196 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.480172 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7k2bn\"/\"kube-root-ca.crt\"" Apr 16 17:05:10.480327 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.480218 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7k2bn\"/\"openshift-service-ca.crt\"" Apr 16 17:05:10.480374 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.480325 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7k2bn\"/\"default-dockercfg-nkv5v\"" Apr 16 17:05:10.484865 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.484596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k2bn/must-gather-pxb6g"] Apr 16 17:05:10.574568 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.574494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9slcj\" (UniqueName: \"kubernetes.io/projected/73cd736a-9293-4c91-95bb-8512db02b9b9-kube-api-access-9slcj\") pod \"must-gather-pxb6g\" (UID: \"73cd736a-9293-4c91-95bb-8512db02b9b9\") " pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.574754 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.574619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73cd736a-9293-4c91-95bb-8512db02b9b9-must-gather-output\") pod \"must-gather-pxb6g\" (UID: \"73cd736a-9293-4c91-95bb-8512db02b9b9\") " pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.675573 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.675503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9slcj\" (UniqueName: \"kubernetes.io/projected/73cd736a-9293-4c91-95bb-8512db02b9b9-kube-api-access-9slcj\") pod \"must-gather-pxb6g\" (UID: \"73cd736a-9293-4c91-95bb-8512db02b9b9\") " pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.675573 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.675586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73cd736a-9293-4c91-95bb-8512db02b9b9-must-gather-output\") pod \"must-gather-pxb6g\" (UID: \"73cd736a-9293-4c91-95bb-8512db02b9b9\") " pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.675894 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.675878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73cd736a-9293-4c91-95bb-8512db02b9b9-must-gather-output\") pod \"must-gather-pxb6g\" (UID: \"73cd736a-9293-4c91-95bb-8512db02b9b9\") " pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.684806 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.684774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9slcj\" (UniqueName: \"kubernetes.io/projected/73cd736a-9293-4c91-95bb-8512db02b9b9-kube-api-access-9slcj\") pod \"must-gather-pxb6g\" (UID: \"73cd736a-9293-4c91-95bb-8512db02b9b9\") " pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.787433 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.787345 2577 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-7k2bn/must-gather-pxb6g" Apr 16 17:05:10.908792 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.908751 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k2bn/must-gather-pxb6g"] Apr 16 17:05:10.911617 ip-10-0-130-130 kubenswrapper[2577]: W0416 17:05:10.911581 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73cd736a_9293_4c91_95bb_8512db02b9b9.slice/crio-ca66e18a96e7e70350de967ee1a23df26dc7598c8afa20e4c663d42b32e19526 WatchSource:0}: Error finding container ca66e18a96e7e70350de967ee1a23df26dc7598c8afa20e4c663d42b32e19526: Status 404 returned error can't find the container with id ca66e18a96e7e70350de967ee1a23df26dc7598c8afa20e4c663d42b32e19526 Apr 16 17:05:10.913214 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:10.913198 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:05:11.372372 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:11.372333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/must-gather-pxb6g" event={"ID":"73cd736a-9293-4c91-95bb-8512db02b9b9","Type":"ContainerStarted","Data":"ca66e18a96e7e70350de967ee1a23df26dc7598c8afa20e4c663d42b32e19526"} Apr 16 17:05:12.377367 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:12.377330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/must-gather-pxb6g" event={"ID":"73cd736a-9293-4c91-95bb-8512db02b9b9","Type":"ContainerStarted","Data":"ce355892f714cb85ac18afc0b15b409cb14ff769b7feae04a5d81c951ec75eed"} Apr 16 17:05:12.377367 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:12.377372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/must-gather-pxb6g" event={"ID":"73cd736a-9293-4c91-95bb-8512db02b9b9","Type":"ContainerStarted","Data":"517db856b6fb6d24760893404bc89b4003ae8eabb52b244c69b0e9ef83629ab1"} Apr 16 17:05:12.393241 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:12.393182 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7k2bn/must-gather-pxb6g" podStartSLOduration=1.409788973 podStartE2EDuration="2.393165556s" podCreationTimestamp="2026-04-16 17:05:10 +0000 UTC" firstStartedPulling="2026-04-16 17:05:10.913324896 +0000 UTC m=+3728.428362971" lastFinishedPulling="2026-04-16 17:05:11.896701479 +0000 UTC m=+3729.411739554" observedRunningTime="2026-04-16 17:05:12.391580146 +0000 UTC m=+3729.906618243" watchObservedRunningTime="2026-04-16 17:05:12.393165556 +0000 UTC m=+3729.908203653" Apr 16 17:05:13.343386 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:13.343348 2577 ???:1] "http: TLS handshake error from 10.0.130.130:49418: EOF" Apr 16 17:05:13.349727 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:13.349691 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m9sr4_b315b40d-dd6d-45a2-aab1-242618785118/global-pull-secret-syncer/0.log" Apr 16 17:05:13.634533 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:13.634489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mpcnj_6b105e59-460b-469d-be97-f4653f502e92/konnectivity-agent/0.log" Apr 16 17:05:13.681554 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:13.681521 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-130.ec2.internal_4a6fddb6306d2ef465913be401643923/haproxy/0.log" Apr 16 17:05:16.801736 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:16.801671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-lc5qk_943564a8-f27b-4d3d-950c-923624bfb085/kube-state-metrics/0.log" Apr 16 17:05:16.826243 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:16.826163 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-lc5qk_943564a8-f27b-4d3d-950c-923624bfb085/kube-rbac-proxy-main/0.log" Apr 16 17:05:16.858350 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:16.858309 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-lc5qk_943564a8-f27b-4d3d-950c-923624bfb085/kube-rbac-proxy-self/0.log" Apr 16 17:05:17.061832 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:17.061749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8g2p_de882888-cb2e-44b5-b474-0352b65e5a08/node-exporter/0.log" Apr 16 17:05:17.085480 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:17.085446 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8g2p_de882888-cb2e-44b5-b474-0352b65e5a08/kube-rbac-proxy/0.log" Apr 16 17:05:17.106620 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:17.106598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v8g2p_de882888-cb2e-44b5-b474-0352b65e5a08/init-textfile/0.log" Apr 16 17:05:17.220795 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:17.220768 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-2cdf8_6f959bf3-102d-4116-8396-d8ae64211287/kube-rbac-proxy-main/0.log" Apr 16 17:05:17.252743 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:17.252712 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-2cdf8_6f959bf3-102d-4116-8396-d8ae64211287/kube-rbac-proxy-self/0.log" Apr 16 17:05:17.278147 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:17.278116 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-2cdf8_6f959bf3-102d-4116-8396-d8ae64211287/openshift-state-metrics/0.log" Apr 16 17:05:20.884589 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:20.884489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hccnt_b5434c75-d025-4504-969f-7f577a43a937/dns/0.log" Apr 16 17:05:20.905785 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:20.905757 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hccnt_b5434c75-d025-4504-969f-7f577a43a937/kube-rbac-proxy/0.log" Apr 16 17:05:20.969234 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:20.969202 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm"] Apr 16 17:05:20.973756 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:20.973731 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:20.981776 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:20.981748 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm"] Apr 16 17:05:21.063414 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.063390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-chvcc_a4144120-204d-44b5-9a92-fb19a8e03118/dns-node-resolver/0.log" Apr 16 17:05:21.067721 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.067699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-sys\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.067835 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.067751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-proc\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.067835 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.067776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-lib-modules\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.067835 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.067811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-podres\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.067835 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.067832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2zk\" (UniqueName: \"kubernetes.io/projected/a2ffeb09-0d01-4177-8f38-a4dc828fb693-kube-api-access-bj2zk\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168414 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-sys\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168414 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-proc\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " 
pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168615 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-proc\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168615 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-sys\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168615 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-lib-modules\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168615 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-podres\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168615 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2zk\" (UniqueName: \"kubernetes.io/projected/a2ffeb09-0d01-4177-8f38-a4dc828fb693-kube-api-access-bj2zk\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168776 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-podres\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.168776 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.168618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2ffeb09-0d01-4177-8f38-a4dc828fb693-lib-modules\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.176423 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.176400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2zk\" (UniqueName: \"kubernetes.io/projected/a2ffeb09-0d01-4177-8f38-a4dc828fb693-kube-api-access-bj2zk\") pod \"perf-node-gather-daemonset-q2xhm\" (UID: \"a2ffeb09-0d01-4177-8f38-a4dc828fb693\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.286275 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.286243 2577 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:21.434338 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.434311 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm"] Apr 16 17:05:21.437341 ip-10-0-130-130 kubenswrapper[2577]: W0416 17:05:21.437313 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda2ffeb09_0d01_4177_8f38_a4dc828fb693.slice/crio-e6244e958d2bafde5df38d12c2762b73c463aa75b2aed3e218584662261c76ae WatchSource:0}: Error finding container e6244e958d2bafde5df38d12c2762b73c463aa75b2aed3e218584662261c76ae: Status 404 returned error can't find the container with id e6244e958d2bafde5df38d12c2762b73c463aa75b2aed3e218584662261c76ae Apr 16 17:05:21.528419 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:21.528392 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7mgkb_28ab87e7-a174-44d1-a00c-16f49134a9b5/node-ca/0.log" Apr 16 17:05:22.419142 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:22.419101 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" event={"ID":"a2ffeb09-0d01-4177-8f38-a4dc828fb693","Type":"ContainerStarted","Data":"017aaac6992cc3699ebbfa404bb334856420d6981fb10180450cb3665f8d4ef3"} Apr 16 17:05:22.419142 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:22.419151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" event={"ID":"a2ffeb09-0d01-4177-8f38-a4dc828fb693","Type":"ContainerStarted","Data":"e6244e958d2bafde5df38d12c2762b73c463aa75b2aed3e218584662261c76ae"} Apr 16 17:05:22.419724 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:22.419193 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:22.435894 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:22.435829 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" podStartSLOduration=2.435809271 podStartE2EDuration="2.435809271s" podCreationTimestamp="2026-04-16 17:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:05:22.434205328 +0000 UTC m=+3739.949243435" watchObservedRunningTime="2026-04-16 17:05:22.435809271 +0000 UTC m=+3739.950847367" Apr 16 17:05:22.562454 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:22.562424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b8qrw_efb959fd-8419-441a-a0a0-4d727132d9de/serve-healthcheck-canary/0.log" Apr 16 17:05:23.026969 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:23.026936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gbzsb_6886fe81-08eb-4098-9867-798c7f5e4257/kube-rbac-proxy/0.log" Apr 16 17:05:23.045888 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:23.045859 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gbzsb_6886fe81-08eb-4098-9867-798c7f5e4257/exporter/0.log" Apr 16 17:05:23.068264 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:23.068235 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-gbzsb_6886fe81-08eb-4098-9867-798c7f5e4257/extractor/0.log" Apr 16 17:05:25.141277 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:25.141250 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-kj22m_ad9ba298-ec3a-4e1d-b409-0848a1cad1ed/manager/0.log" Apr 16 17:05:25.388547 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:25.388499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6c9kq_9c965d9c-d49d-4edc-977c-7912744b335d/manager/0.log" Apr 16 17:05:25.480783 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:25.480715 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-mzfnp_2587e677-476b-40cf-b78f-23ac06bc584b/seaweedfs/0.log" Apr 16 17:05:25.503722 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:25.503681 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-p954m_87ed86d3-443d-4db0-a511-f793db537a82/seaweedfs-tls-custom/0.log" Apr 16 17:05:28.435359 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:28.435325 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-q2xhm" Apr 16 17:05:30.543034 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.543001 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/kube-multus-additional-cni-plugins/0.log" Apr 16 17:05:30.565259 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.565223 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/egress-router-binary-copy/0.log" Apr 16 17:05:30.585500 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.585428 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/cni-plugins/0.log" Apr 16 17:05:30.606430 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.606402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/bond-cni-plugin/0.log" Apr 16 17:05:30.627305 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.627276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/routeoverride-cni/0.log" Apr 16 17:05:30.649680 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.649596 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/whereabouts-cni-bincopy/0.log" Apr 16 17:05:30.671106 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:30.671034 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blk6n_4cc6472e-53b0-4010-9c7e-5fae56b32e00/whereabouts-cni/0.log" Apr 16 17:05:31.069215 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:31.069190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vshnw_5e6c7798-3e64-45d4-88fa-1f044dd7030c/kube-multus/0.log" Apr 16 17:05:31.088569 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:31.088540 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-2mqsw_dd5274ed-46c8-46c2-a74e-26859678b08d/network-metrics-daemon/0.log" Apr 16 17:05:31.106927 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:31.106903 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2mqsw_dd5274ed-46c8-46c2-a74e-26859678b08d/kube-rbac-proxy/0.log" Apr 16 17:05:32.573538 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.573438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-controller/0.log" Apr 16 17:05:32.594775 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.594746 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/0.log" Apr 16 17:05:32.613438 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.613398 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovn-acl-logging/1.log" Apr 16 17:05:32.636064 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.636035 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/kube-rbac-proxy-node/0.log" Apr 16 17:05:32.657704 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.657665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 17:05:32.674915 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.674881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/northd/0.log" Apr 16 17:05:32.697362 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.697334 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/nbdb/0.log" Apr 16 17:05:32.718889 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.718859 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/sbdb/0.log" Apr 16 17:05:32.827680 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:32.827599 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h6xk8_90e6fc2e-70a5-41d5-9a11-ce841bf5eabf/ovnkube-controller/0.log" Apr 16 17:05:33.767336 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:33.767303 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5n846_52215440-c220-46dc-927b-72ff3dad940a/network-check-target-container/0.log" Apr 16 17:05:34.683073 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:34.683041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jf4lx_adb1352d-acea-4d2f-aff2-10575539bfae/iptables-alerter/0.log" Apr 16 17:05:35.329663 ip-10-0-130-130 kubenswrapper[2577]: I0416 17:05:35.329629 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6q4mf_14943d57-3e54-4ff5-8849-7eccefbe2aa1/tuned/0.log"