Apr 17 11:13:38.197264 ip-10-0-139-136 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:38.197279 ip-10-0-139-136 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:38.197289 ip-10-0-139-136 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:38.197623 ip-10-0-139-136 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:13:48.206831 ip-10-0-139-136 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:13:48.206846 ip-10-0-139-136 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b05250de66d9431a83ef2e1cdcaebad2 --
Apr 17 11:16:23.743510 ip-10-0-139-136 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:24.244356 ip-10-0-139-136 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:24.244356 ip-10-0-139-136 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:24.244356 ip-10-0-139-136 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:24.244356 ip-10-0-139-136 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:24.244356 ip-10-0-139-136 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:24.245488 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.245400    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:24.248538 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248523    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248540    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248544    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248548    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248551    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248554    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248557    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248560    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248563    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248566    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248569    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248572    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248575    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248577    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248580    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:24.248579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248584    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248588    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248591    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248594    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248596    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248606    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248609    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248612    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248614    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248617    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248619    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248622    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248624    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248627    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248630    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248632    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248635    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248637    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248640    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248642    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:24.249006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248645    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248647    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248650    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248652    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248655    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248657    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248660    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248662    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248665    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248667    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248669    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248672    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248675    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248678    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248681    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248684    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248686    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248689    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248691    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248694    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:24.249516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248702    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248705    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248708    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248711    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248713    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248716    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248718    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248721    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248723    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248726    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248729    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248731    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248734    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248740    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248743    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248746    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248749    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248751    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248753    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:24.250034 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248756    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248762    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248765    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248767    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248770    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248773    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248775    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248778    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248781    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248784    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248786    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.248789    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249165    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249170    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249173    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249176    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249178    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249181    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249184    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249186    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:24.250507 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249189    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249192    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249195    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249197    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249199    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249202    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249205    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249223    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249228    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249232    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249236    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249240    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249242    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249246    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249249    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249251    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249254    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249257    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249259    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249263    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:24.250996 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249266    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249269    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249272    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249275    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249277    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249280    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249283    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249285    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249288    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249290    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249293    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249295    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249298    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249300    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249303    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249305    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249308    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249310    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249313    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249317    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:24.251515 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249319    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249322    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249326    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249329    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249331    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249334    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249338    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249341    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249343    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249346    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249348    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249354    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249357    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249360    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249363    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249366    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249369    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249371    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249374    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:24.252006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249376    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249379    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249381    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249388    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249390    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249393    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249395    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249398    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249400    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249403    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249406    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249408    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249411    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249414    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249416    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249419    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249421    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249424    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.249426    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249504    2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249521    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:24.252486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249531    2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249537    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249543    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249549    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249554    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249558    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249562    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249565    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249568    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249571    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249575    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249578    2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249581    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249584    2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249586    2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249589    2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249593    2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249601    2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249604    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249607    2571 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249610    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249614    2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249618    2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249621    2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:24.252985 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249625    2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249628    2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249631    2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249634    2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249637    2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249640    2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249643    2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249648    2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249652    2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249655    2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249658    2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249661    2571 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249664    2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249669    2571 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249672    2571 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249675    2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:24.253578
ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249678 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249681 2571 flags.go:64] FLAG: --eviction-hard="" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249685 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249688 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249691 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249694 2571 flags.go:64] FLAG: --eviction-soft="" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249697 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249700 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249703 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 11:16:24.253578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249706 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249709 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249712 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249716 2571 flags.go:64] FLAG: --feature-gates="" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249720 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249723 2571 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249726 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249729 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249732 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249735 2571 flags.go:64] FLAG: --help="false" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249738 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249741 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249744 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249747 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249751 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249754 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249761 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249764 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249767 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:24.254177 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:24.249770 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249773 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249776 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249779 2571 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249782 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:24.254177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249785 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249788 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249791 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249794 2571 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249797 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249800 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249803 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249808 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249811 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249814 2571 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249817 2571 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249820 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249824 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249827 2571 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249830 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249835 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249837 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249842 2571 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249845 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249848 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249851 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249854 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249857 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249860 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:24.254826 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:24.249863 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:24.254826 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249873 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249876 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249879 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249882 2571 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249885 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249891 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249894 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249897 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249903 2571 flags.go:64] FLAG: --port="10250" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249906 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249909 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01d1e9bc753a92394" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249912 2571 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249915 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 17 
11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249918 2571 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249921 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249924 2571 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249928 2571 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249931 2571 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249934 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249937 2571 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249941 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249944 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249947 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249950 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249953 2571 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:24.255486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249957 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249960 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249963 2571 flags.go:64] FLAG: --seccomp-default="false" 
Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249966 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249968 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249971 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249974 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249978 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249981 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249984 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249987 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249990 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249993 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249996 2571 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.249999 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250009 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250012 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:24.256080 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:24.250015 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250019 2571 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250021 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250024 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250027 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250030 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250033 2571 flags.go:64] FLAG: --v="2" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250038 2571 flags.go:64] FLAG: --version="false" Apr 17 11:16:24.256080 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250042 2571 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250047 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250050 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250142 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250145 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250148 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250152 2571 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250155 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250157 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250160 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250163 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250165 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250168 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250171 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250173 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250176 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250180 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250184 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250187 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250189 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:24.256689 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250192 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250195 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250199 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250202 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250205 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250221 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250224 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250226 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250229 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250232 2571 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250235 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250238 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250240 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250247 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250249 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250252 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250255 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250257 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250260 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250262 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:24.257176 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250265 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250268 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:24.257715 ip-10-0-139-136 
kubenswrapper[2571]: W0417 11:16:24.250270 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250273 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250276 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250278 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250281 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250283 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250286 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250289 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250291 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250294 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250297 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250299 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250303 2571 feature_gate.go:328] unrecognized 
feature gate: GCPClusterHostedDNS Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250306 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250308 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250311 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250314 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250317 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:24.257715 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250319 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250322 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250324 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250328 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250332 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250336 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250340 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250342 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250345 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250348 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250351 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250353 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250356 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250358 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250361 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250363 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250366 
2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250369 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250371 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:24.258264 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250374 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250376 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250379 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250382 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250385 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250387 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250390 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250393 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250396 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.250398 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.250410 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.256751 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.256855 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256906 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256911 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256915 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:24.258738 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256918 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256920 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256923 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256926 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256928 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256931 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256934 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256936 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256939 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256941 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256944 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256947 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256950 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256953 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256956 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256958 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256961 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256963 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256966 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256969 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:24.259138 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256971 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256974 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256977 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256979 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256982 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256984 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256987 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256989 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256993 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256996 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.256998 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257001 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257004 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257006 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257009 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257012 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257014 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257017 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257019 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257022 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:24.259652 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257024 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257027 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257029 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257032 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257034 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257037 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257040 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257043 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257045 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257048 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257052 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257056 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257058 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257061 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257063 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257066 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257068 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257071 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257073 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:24.260146 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257076 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257079 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257082 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257085 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257087 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257090 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257093 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257095 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257098 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257100 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257103 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257105 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257108 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257111 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257114 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257118 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257122 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257126 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257129 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:24.260639 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257132 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257135 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257138 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257141 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257144 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.257149 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257266 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257272 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257275 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257277 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257280 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257283 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257286 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257289 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257291 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257294 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:24.261099 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257297 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257300 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257303 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257305 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257308 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257311 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257314 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257316 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257319 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257321 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257324 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257326 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257329 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257332 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257334 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257337 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257339 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257342 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257345 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257347 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:24.261513 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257349 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257352 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257354 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257357 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257360 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257362 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257364 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257367 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257369 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257372 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257374 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257377 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257380 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257384 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257386 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257389 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257391 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257394 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257396 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257399 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:24.262003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257401 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257404 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257406 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257409 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257412 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257414 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257417 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257420 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257423 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257425 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257428 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257430 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257433 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257436 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257438 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257442 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257446 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257449 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257452 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:24.262625 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257454 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257457 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257459 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257462 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257465 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257467 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257470 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257473 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257476 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257478 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257481 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257483 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257486 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257488 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257491 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257495 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:24.263089 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:24.257498 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:24.263502 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.257503 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:24.263502 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.257615 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:24.263502 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.261184 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:24.263502 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.262190 2571 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:24.263502 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.262238 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:24.263502 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.262279 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:24.288060 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.288038 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:24.290773 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.290747 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:24.305101 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.305084 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:24.310610 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.310593 2571 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:24.311710 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.311694 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:24.318231 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.318193 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:24.320447 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.320426 2571 fs.go:135] Filesystem UUIDs: map[1f7a2eb3-f1d5-4219-96d6-ac84786864df:/dev/nvme0n1p4 2e06087a-5669-4f15-8dab-aa87bbe95d3b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 11:16:24.320492 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.320447 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:24.326121 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.326012 2571 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:24.324111216 +0000 UTC m=+0.455887905 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100780 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fccd6715cf9a328b13de94cd64672 SystemUUID:ec2fccd6-715c-f9a3-28b1-3de94cd64672 BootID:b05250de-66d9-431a-83ef-2e1cdcaebad2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:57:5e:12:59:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:57:5e:12:59:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:b7:65:36:1d:8e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:24.326121 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.326116 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:24.326263 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.326251 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:24.327275 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.327249 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:24.327427 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.327277 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-136.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:16:24.327480 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.327436 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:16:24.327480 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.327444 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:16:24.327480 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.327457
2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:24.327960 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.327950 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:24.328906 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.328896 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:24.329012 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.329003 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 11:16:24.331434 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.331424 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 17 11:16:24.331470 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.331441 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 11:16:24.331470 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.331453 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 11:16:24.331470 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.331463 2571 kubelet.go:397] "Adding apiserver pod source" Apr 17 11:16:24.331577 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.331472 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 11:16:24.333028 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.333016 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:24.333073 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.333050 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:24.340176 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.340151 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster 
scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:24.340243 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.340172 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-136.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:24.346360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.346346 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:16:24.347718 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.347705 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:24.349463 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349453 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:24.349506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349469 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:24.349506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349476 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:24.349506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349481 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:24.349506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349486 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:24.349506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349492 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:24.349506 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:24.349498 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:24.349506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349503 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:24.349680 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349509 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:24.349680 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349515 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:24.349680 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349523 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:16:24.349680 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.349532 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:24.350479 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.350470 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:24.350479 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.350479 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:24.353329 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.353310 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-136.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:24.354032 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.354021 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:24.354063 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.354057 2571 server.go:1295] "Started kubelet" Apr 17 11:16:24.354183 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.354152 2571 server.go:180] "Starting to listen" 
address="0.0.0.0" port=10250 Apr 17 11:16:24.354278 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.354160 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:24.354278 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.354251 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:24.356359 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.356340 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:24.356645 ip-10-0-139-136 systemd[1]: Started Kubernetes Kubelet. Apr 17 11:16:24.360319 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.360303 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:24.362477 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.362456 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:24.362568 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.362488 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:24.363331 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.363315 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:24.363438 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.363426 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:24.363504 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.363325 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:24.363554 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.363538 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:24.363554 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.363548 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:24.363637 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.363582 2571 kubelet_node_status.go:515] "Error 
getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:24.364621 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364603 2571 factory.go:55] Registering systemd factory Apr 17 11:16:24.364701 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364630 2571 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:24.364847 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364833 2571 factory.go:153] Registering CRI-O factory Apr 17 11:16:24.364891 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364849 2571 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:24.364935 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364900 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:24.364935 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364927 2571 factory.go:103] Registering Raw factory Apr 17 11:16:24.365030 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.364942 2571 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:24.365287 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.365271 2571 manager.go:319] Starting recovery of all containers Apr 17 11:16:24.367504 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.367479 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:24.372431 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.372402 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-136.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 11:16:24.372520 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.372488 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 11:16:24.373382 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.372402 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-136.ec2.internal.18a720bf60b16d46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-136.ec2.internal,UID:ip-10-0-139-136.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-136.ec2.internal,},FirstTimestamp:2026-04-17 11:16:24.354032966 +0000 UTC m=+0.485809643,LastTimestamp:2026-04-17 11:16:24.354032966 +0000 UTC m=+0.485809643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-136.ec2.internal,}" Apr 17 11:16:24.375120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.375103 2571 
manager.go:324] Recovery completed Apr 17 11:16:24.378838 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.378820 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:24.381368 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.381353 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:24.381426 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.381380 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:24.381426 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.381392 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:24.381868 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.381853 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:24.381868 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.381866 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:24.381979 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.381881 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:24.383876 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.383863 2571 policy_none.go:49] "None policy: Start" Apr 17 11:16:24.383876 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.383880 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:24.383963 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.383889 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:24.384058 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.383988 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-139-136.ec2.internal.18a720bf62528661 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-136.ec2.internal,UID:ip-10-0-139-136.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-136.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-136.ec2.internal,},FirstTimestamp:2026-04-17 11:16:24.381367905 +0000 UTC m=+0.513144586,LastTimestamp:2026-04-17 11:16:24.381367905 +0000 UTC m=+0.513144586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-136.ec2.internal,}" Apr 17 11:16:24.392248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.392231 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wjxdd" Apr 17 11:16:24.397688 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.397629 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-136.ec2.internal.18a720bf6252ca08 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-136.ec2.internal,UID:ip-10-0-139-136.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-139-136.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-139-136.ec2.internal,},FirstTimestamp:2026-04-17 11:16:24.381385224 +0000 UTC m=+0.513161903,LastTimestamp:2026-04-17 11:16:24.381385224 +0000 UTC m=+0.513161903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-136.ec2.internal,}" Apr 17 11:16:24.403903 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.403876 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wjxdd" Apr 17 11:16:24.407541 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.407455 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-136.ec2.internal.18a720bf6252f369 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-136.ec2.internal,UID:ip-10-0-139-136.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-139-136.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-139-136.ec2.internal,},FirstTimestamp:2026-04-17 11:16:24.381395817 +0000 UTC m=+0.513172496,LastTimestamp:2026-04-17 11:16:24.381395817 +0000 UTC m=+0.513172496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-136.ec2.internal,}" Apr 17 11:16:24.419830 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.419815 2571 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.419855 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.419870 2571 server.go:85] "Starting device plugin registration server" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.420117 2571 eviction_manager.go:189] "Eviction 
manager: starting control loop" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.420129 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.420240 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.420339 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.420349 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.421008 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 11:16:24.432802 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.421050 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:24.501785 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.501693 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:24.503048 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.503023 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 11:16:24.503192 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.503059 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:24.503192 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.503084 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 11:16:24.503192 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.503094 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:24.503192 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.503184 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:24.507193 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.507175 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:24.520529 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.520514 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:24.522176 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.522061 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:24.522176 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.522094 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:24.522176 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.522108 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:24.522176 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.522134 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.530531 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.530511 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.530610 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.530534 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-136.ec2.internal\": node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 
11:16:24.570409 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.570382 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:24.604280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.604251 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"] Apr 17 11:16:24.604406 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.604325 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:24.606903 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.606887 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:24.606995 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.606917 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:24.606995 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.606932 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:24.608071 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608058 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:24.608242 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608226 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.608319 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608262 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:24.608820 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608793 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:24.608820 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608809 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:24.608820 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608822 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:24.609002 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608829 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:24.609002 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608836 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:24.609002 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.608839 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:24.610016 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.610001 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.610110 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.610029 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:24.610653 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.610637 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:24.610653 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.610661 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:24.610798 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.610677 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:24.631846 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.631825 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-136.ec2.internal\" not found" node="ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.636278 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.636262 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-136.ec2.internal\" not found" node="ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.665831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.665806 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.665934 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:24.665842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.665934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.665858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b16f66466a195429f73bc1a0dcec09-config\") pod \"kube-apiserver-proxy-ip-10-0-139-136.ec2.internal\" (UID: \"57b16f66466a195429f73bc1a0dcec09\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.670911 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.670890 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:24.766034 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.765952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.766034 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.765990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b16f66466a195429f73bc1a0dcec09-config\") pod \"kube-apiserver-proxy-ip-10-0-139-136.ec2.internal\" (UID: \"57b16f66466a195429f73bc1a0dcec09\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.766034 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.766005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.766251 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.766048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.766251 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.766062 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.766251 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.766065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b16f66466a195429f73bc1a0dcec09-config\") pod \"kube-apiserver-proxy-ip-10-0-139-136.ec2.internal\" (UID: \"57b16f66466a195429f73bc1a0dcec09\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.771001 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.770982 2571 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:24.871782 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.871755 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:24.933965 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.933942 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.938474 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:24.938455 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" Apr 17 11:16:24.972625 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:24.972590 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:25.073182 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.073080 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:25.173693 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.173651 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found" Apr 17 11:16:25.223571 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.223548 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:25.262884 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.262846 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 11:16:25.263526 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.262995 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:25.263526 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.263034 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:25.263961 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.263945 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" Apr 17 11:16:25.280366 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.280342 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:25.282532 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.282520 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" Apr 17 11:16:25.295287 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.295270 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:25.332509 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.332434 2571 apiserver.go:52] "Watching apiserver" Apr 17 11:16:25.339311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.339291 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:16:25.339605 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.339584 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-bq85c","openshift-multus/network-metrics-daemon-fcgzf","openshift-network-diagnostics/network-check-target-82xvr","openshift-network-operator/iptables-alerter-h527p","openshift-ovn-kubernetes/ovnkube-node-nj28v","kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal","openshift-dns/node-resolver-rd6gl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"] Apr 17 11:16:25.342281 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.342261 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.342327 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.342304 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:25.342547 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.342510 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29" Apr 17 11:16:25.343316 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.343297 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:25.343396 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.343364 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077" Apr 17 11:16:25.344414 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.344399 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.345892 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.345498 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:16:25.345892 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.345544 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 11:16:25.345892 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.345729 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:16:25.345892 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.345761 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:16:25.346791 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.345960 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:16:25.346791 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.345964 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qwnw4\"" Apr 17 11:16:25.347024 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.346934 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.347116 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.347075 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:25.347621 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.347493 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:16:25.347621 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.347497 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:25.347862 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.347838 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7pfnk\"" Apr 17 11:16:25.348521 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.348504 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.349676 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.349661 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:25.350990 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.350976 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:25.351188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.351176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:25.351255 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.351177 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:25.351304 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.351276 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:25.351304 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.351283 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:25.352289 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.352272 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rf766\"" Apr 17 11:16:25.352344 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.352324 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-96w8j\"" Apr 17 11:16:25.352450 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.352423 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:16:25.358486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.358470 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:25.363589 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.363571 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:25.364179 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.364163 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:25.369644 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:25.369736 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-systemd-units\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.369736 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-slash\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.369736 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369706 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-systemd\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.369852 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-etc-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.369852 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369835 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4xf\" (UniqueName: \"kubernetes.io/projected/924b7a18-8af5-4d88-b11b-b9df79a3809c-kube-api-access-5w4xf\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.369930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-host-slash\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.369930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nx9\" (UniqueName: \"kubernetes.io/projected/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-kube-api-access-48nx9\") pod \"iptables-alerter-h527p\" (UID: 
\"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.369930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-var-lib-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-os-release\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.370061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.370061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.369983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:25.370061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370005 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-ovn\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-cni-netd\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370127 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nj28v\" (UID: 
\"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-node-log\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99e275f8-856c-4b32-9170-b68141483240-hosts-file\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwhf\" (UniqueName: \"kubernetes.io/projected/1b1c44fb-0716-4e07-9409-72264a348f29-kube-api-access-ccwhf\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-cni-bin\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99e275f8-856c-4b32-9170-b68141483240-tmp-dir\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.370360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-log-socket\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovnkube-config\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:25.370414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-env-overrides\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovnkube-script-lib\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-iptables-alerter-script\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370517 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-kubelet\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovn-node-metrics-cert\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4t2v\" (UniqueName: \"kubernetes.io/projected/45d5b2c3-f6da-4326-a766-68dc042f85ef-kube-api-access-c4t2v\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xmn\" (UniqueName: \"kubernetes.io/projected/99e275f8-856c-4b32-9170-b68141483240-kube-api-access-c9xmn\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.370649 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-system-cni-dir\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.370649 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:25.370652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-cnibin\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.371061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.370671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-run-netns\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.375424 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.375403 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:25.402802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.402779 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gln7w" Apr 17 11:16:25.406908 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.406864 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:24 +0000 UTC" deadline="2027-12-23 21:59:00.080076092 +0000 UTC" Apr 17 11:16:25.406971 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.406909 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14770h42m34.67317047s" Apr 17 11:16:25.471699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.471699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwhf\" (UniqueName: \"kubernetes.io/projected/1b1c44fb-0716-4e07-9409-72264a348f29-kube-api-access-ccwhf\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-cni-bin\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99e275f8-856c-4b32-9170-b68141483240-tmp-dir\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.471930 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:25.471781 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-log-socket\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovnkube-config\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-env-overrides\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovnkube-script-lib\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " 
pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.471930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-iptables-alerter-script\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-log-socket\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.471955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-kubelet\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovn-node-metrics-cert\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4t2v\" (UniqueName: \"kubernetes.io/projected/45d5b2c3-f6da-4326-a766-68dc042f85ef-kube-api-access-c4t2v\") pod 
\"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xmn\" (UniqueName: \"kubernetes.io/projected/99e275f8-856c-4b32-9170-b68141483240-kube-api-access-c9xmn\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472080 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-system-cni-dir\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-cnibin\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-run-netns\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-systemd-units\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-slash\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-systemd\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-etc-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w4xf\" (UniqueName: \"kubernetes.io/projected/924b7a18-8af5-4d88-b11b-b9df79a3809c-kube-api-access-5w4xf\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-host-slash\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.472392 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48nx9\" (UniqueName: \"kubernetes.io/projected/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-kube-api-access-48nx9\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99e275f8-856c-4b32-9170-b68141483240-tmp-dir\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472375 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-var-lib-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-os-release\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472442 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-kubelet\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " 
pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472476 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-host-slash\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-ovn\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-cni-netd\") pod \"ovnkube-node-nj28v\" (UID: 
\"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-slash\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472603 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovnkube-config\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovnkube-script-lib\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-node-log\") 
pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-systemd\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472642 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-etc-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99e275f8-856c-4b32-9170-b68141483240-hosts-file\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-node-log\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-iptables-alerter-script\") pod \"iptables-alerter-h527p\" 
(UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472732 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-var-lib-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472400 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472753 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99e275f8-856c-4b32-9170-b68141483240-hosts-file\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-systemd-units\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472836 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-os-release\") pod 
\"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.472848 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-system-cni-dir\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-cni-netd\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/924b7a18-8af5-4d88-b11b-b9df79a3809c-cnibin\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472927 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-openvswitch\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 
11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.472938 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:25.972915125 +0000 UTC m=+2.104691813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.473992 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-run-ovn\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.472991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 
11:16:25.472989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-run-netns\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.473015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45d5b2c3-f6da-4326-a766-68dc042f85ef-host-cni-bin\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.473115 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.473189 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.473325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45d5b2c3-f6da-4326-a766-68dc042f85ef-env-overrides\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.474819 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.473385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/924b7a18-8af5-4d88-b11b-b9df79a3809c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.475628 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.475610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45d5b2c3-f6da-4326-a766-68dc042f85ef-ovn-node-metrics-cert\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.482150 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.482130 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:25.482150 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.482150 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:25.482373 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.482162 2571 projected.go:194] Error preparing data for projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:25.482373 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.482207 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:25.982193401 +0000 UTC m=+2.113970071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:25.483798 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.483779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nx9\" (UniqueName: \"kubernetes.io/projected/37c3dfdf-e797-46f5-bc3b-3ec9329d15fe-kube-api-access-48nx9\") pod \"iptables-alerter-h527p\" (UID: \"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe\") " pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.483892 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.483812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwhf\" (UniqueName: \"kubernetes.io/projected/1b1c44fb-0716-4e07-9409-72264a348f29-kube-api-access-ccwhf\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:25.484777 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.484757 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xmn\" (UniqueName: \"kubernetes.io/projected/99e275f8-856c-4b32-9170-b68141483240-kube-api-access-c9xmn\") pod \"node-resolver-rd6gl\" (UID: \"99e275f8-856c-4b32-9170-b68141483240\") " pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.484844 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.484828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4xf\" (UniqueName: 
\"kubernetes.io/projected/924b7a18-8af5-4d88-b11b-b9df79a3809c-kube-api-access-5w4xf\") pod \"multus-additional-cni-plugins-bq85c\" (UID: \"924b7a18-8af5-4d88-b11b-b9df79a3809c\") " pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.484898 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.484884 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4t2v\" (UniqueName: \"kubernetes.io/projected/45d5b2c3-f6da-4326-a766-68dc042f85ef-kube-api-access-c4t2v\") pod \"ovnkube-node-nj28v\" (UID: \"45d5b2c3-f6da-4326-a766-68dc042f85ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.565914 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.565762 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:25.574732 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:25.574696 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b16f66466a195429f73bc1a0dcec09.slice/crio-ee39bdc46e45c1c5a53664305246582f42a7fb3387fdedeacb163698edae2819 WatchSource:0}: Error finding container ee39bdc46e45c1c5a53664305246582f42a7fb3387fdedeacb163698edae2819: Status 404 returned error can't find the container with id ee39bdc46e45c1c5a53664305246582f42a7fb3387fdedeacb163698edae2819 Apr 17 11:16:25.579071 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.579056 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:16:25.579500 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:25.579481 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc05921670fca4842dd48b5deb56ad8b1.slice/crio-4f2cf4c851cb8a306becb7843a859e2f4d685b296cff217c867270e14938a51b WatchSource:0}: Error finding container 
4f2cf4c851cb8a306becb7843a859e2f4d685b296cff217c867270e14938a51b: Status 404 returned error can't find the container with id 4f2cf4c851cb8a306becb7843a859e2f4d685b296cff217c867270e14938a51b Apr 17 11:16:25.669246 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.669192 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bq85c" Apr 17 11:16:25.676051 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.676025 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h527p" Apr 17 11:16:25.676783 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:25.676687 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod924b7a18_8af5_4d88_b11b_b9df79a3809c.slice/crio-1e0744af563688997afc038b4f38b3d31d64c6eff29ee51ceffb41496b0de2c2 WatchSource:0}: Error finding container 1e0744af563688997afc038b4f38b3d31d64c6eff29ee51ceffb41496b0de2c2: Status 404 returned error can't find the container with id 1e0744af563688997afc038b4f38b3d31d64c6eff29ee51ceffb41496b0de2c2 Apr 17 11:16:25.683062 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:25.683041 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37c3dfdf_e797_46f5_bc3b_3ec9329d15fe.slice/crio-104be40b3a9bbe6de73c3c3704c368d2bd8e154894ff7e765ad232c2ca0bd1a8 WatchSource:0}: Error finding container 104be40b3a9bbe6de73c3c3704c368d2bd8e154894ff7e765ad232c2ca0bd1a8: Status 404 returned error can't find the container with id 104be40b3a9bbe6de73c3c3704c368d2bd8e154894ff7e765ad232c2ca0bd1a8 Apr 17 11:16:25.689423 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.689404 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" Apr 17 11:16:25.692877 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.692858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rd6gl" Apr 17 11:16:25.695226 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:25.695189 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d5b2c3_f6da_4326_a766_68dc042f85ef.slice/crio-931a54f74d5bce142a8e0fb27038d009f3a91010cd561f92112420eec37a1d9e WatchSource:0}: Error finding container 931a54f74d5bce142a8e0fb27038d009f3a91010cd561f92112420eec37a1d9e: Status 404 returned error can't find the container with id 931a54f74d5bce142a8e0fb27038d009f3a91010cd561f92112420eec37a1d9e Apr 17 11:16:25.700085 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:25.700051 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e275f8_856c_4b32_9170_b68141483240.slice/crio-b583b787153ef460ee7f13c2e9d8c8654499ec48d22abbe158f95cd31cc6c952 WatchSource:0}: Error finding container b583b787153ef460ee7f13c2e9d8c8654499ec48d22abbe158f95cd31cc6c952: Status 404 returned error can't find the container with id b583b787153ef460ee7f13c2e9d8c8654499ec48d22abbe158f95cd31cc6c952 Apr 17 11:16:25.843980 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.843896 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:25.975767 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:25.975729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" 
Apr 17 11:16:25.975940 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.975887 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:25.976002 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:25.975946 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.975928381 +0000 UTC m=+3.107705049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:26.076029 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.075995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:26.076203 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:26.076140 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:26.076203 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:26.076158 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:26.076203 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:26.076170 2571 projected.go:194] Error preparing data for 
projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:26.076386 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:26.076254 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:27.076235168 +0000 UTC m=+3.208011839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:26.486996 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.486798 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:26.517278 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.517191 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h527p" event={"ID":"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe","Type":"ContainerStarted","Data":"104be40b3a9bbe6de73c3c3704c368d2bd8e154894ff7e765ad232c2ca0bd1a8"} Apr 17 11:16:26.522123 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.522093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerStarted","Data":"1e0744af563688997afc038b4f38b3d31d64c6eff29ee51ceffb41496b0de2c2"} Apr 17 11:16:26.523995 
ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.523952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerStarted","Data":"4f2cf4c851cb8a306becb7843a859e2f4d685b296cff217c867270e14938a51b"} Apr 17 11:16:26.527028 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.527003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" event={"ID":"57b16f66466a195429f73bc1a0dcec09","Type":"ContainerStarted","Data":"ee39bdc46e45c1c5a53664305246582f42a7fb3387fdedeacb163698edae2819"} Apr 17 11:16:26.536902 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.536876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rd6gl" event={"ID":"99e275f8-856c-4b32-9170-b68141483240","Type":"ContainerStarted","Data":"b583b787153ef460ee7f13c2e9d8c8654499ec48d22abbe158f95cd31cc6c952"} Apr 17 11:16:26.539084 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.539061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"931a54f74d5bce142a8e0fb27038d009f3a91010cd561f92112420eec37a1d9e"} Apr 17 11:16:26.989024 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:26.988989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:26.989200 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:26.989147 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 17 11:16:26.989688 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:26.989228 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:28.989195253 +0000 UTC m=+5.120971933 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:27.090659 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:27.090034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:27.090659 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:27.090226 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:27.090659 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:27.090248 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:27.090659 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:27.090262 2571 projected.go:194] Error preparing data for projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:27.090659 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:27.090322 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:29.090302954 +0000 UTC m=+5.222079638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:27.503447 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:27.503410 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:27.503896 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:27.503410 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:27.503896 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:27.503554 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077" Apr 17 11:16:27.503896 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:27.503611 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29" Apr 17 11:16:28.204574 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:28.204546 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:29.002849 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:29.002812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:29.003322 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.002947 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:29.003322 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.003014 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:33.002996038 +0000 UTC m=+9.134772704 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:29.103930 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:29.103890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:29.104111 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.104039 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:29.104111 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.104062 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:29.104111 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.104074 2571 projected.go:194] Error preparing data for projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:29.104279 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.104137 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:33.104119369 +0000 UTC m=+9.235896039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:29.504244 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:29.504170 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:29.504413 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.504310 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077" Apr 17 11:16:29.504413 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:29.504170 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:29.504593 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:29.504565 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29" Apr 17 11:16:30.548638 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:30.548589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" event={"ID":"57b16f66466a195429f73bc1a0dcec09","Type":"ContainerStarted","Data":"f337773a34689d8791f54c68a52f9554f3b8630cddc6658cf2fd747603450885"} Apr 17 11:16:30.563200 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:30.563149 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" podStartSLOduration=5.563135616 podStartE2EDuration="5.563135616s" podCreationTimestamp="2026-04-17 11:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:30.562823361 +0000 UTC m=+6.694600049" watchObservedRunningTime="2026-04-17 11:16:30.563135616 +0000 UTC m=+6.694912302" Apr 17 11:16:31.503896 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.503815 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:31.504058 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.503829 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:31.504058 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:31.503948 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29" Apr 17 11:16:31.504058 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:31.504022 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077" Apr 17 11:16:31.552198 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.552161 2571 generic.go:358] "Generic (PLEG): container finished" podID="924b7a18-8af5-4d88-b11b-b9df79a3809c" containerID="62abcfb10850eb5e744a857e69f2c46b79c14d778a48469a8cf2a5f53c48d150" exitCode=0 Apr 17 11:16:31.552651 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.552248 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerDied","Data":"62abcfb10850eb5e744a857e69f2c46b79c14d778a48469a8cf2a5f53c48d150"} Apr 17 11:16:31.555494 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.555457 2571 generic.go:358] "Generic (PLEG): container finished" podID="c05921670fca4842dd48b5deb56ad8b1" containerID="2449a6c63532304a55009f9448adff31ec0f7be27282119532078cf97f40503a" exitCode=0 Apr 17 11:16:31.555602 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.555562 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerDied","Data":"2449a6c63532304a55009f9448adff31ec0f7be27282119532078cf97f40503a"} Apr 17 11:16:31.557532 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.557506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-rd6gl" event={"ID":"99e275f8-856c-4b32-9170-b68141483240","Type":"ContainerStarted","Data":"c6f461b7b14a5b5ce26dc6e5319d495ad981fb025ab242c0cc227dd4119d700b"} Apr 17 11:16:31.613159 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:31.613102 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rd6gl" podStartSLOduration=2.964037502 podStartE2EDuration="7.613088327s" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC" firstStartedPulling="2026-04-17 11:16:25.702329318 +0000 UTC m=+1.834105982" lastFinishedPulling="2026-04-17 11:16:30.351380137 +0000 UTC m=+6.483156807" observedRunningTime="2026-04-17 11:16:31.612740427 +0000 UTC m=+7.744517117" watchObservedRunningTime="2026-04-17 11:16:31.613088327 +0000 UTC m=+7.744865014" Apr 17 11:16:32.562239 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:32.561735 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/0.log" Apr 17 11:16:32.562703 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:32.562283 2571 generic.go:358] "Generic (PLEG): container finished" podID="c05921670fca4842dd48b5deb56ad8b1" containerID="44403f5b9d652eed5bdf6455d535f44917603a38521e55714c50a8067358c1d0" exitCode=1 Apr 17 11:16:32.562703 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:32.562321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerDied","Data":"44403f5b9d652eed5bdf6455d535f44917603a38521e55714c50a8067358c1d0"} Apr 17 11:16:32.563320 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:32.563044 2571 scope.go:117] "RemoveContainer" containerID="44403f5b9d652eed5bdf6455d535f44917603a38521e55714c50a8067358c1d0" Apr 17 11:16:32.564498 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:16:32.564187 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h527p" event={"ID":"37c3dfdf-e797-46f5-bc3b-3ec9329d15fe","Type":"ContainerStarted","Data":"f3f92128c9e14e55a3cc5dd2dd7ac5b0b8214cb55516a9026e0de8714afc37f9"} Apr 17 11:16:33.031578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.031496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:33.031739 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.031626 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:33.031739 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.031696 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.031677664 +0000 UTC m=+17.163454334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:33.132562 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.132521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:33.132761 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.132679 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:33.132761 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.132695 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:33.132761 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.132704 2571 projected.go:194] Error preparing data for projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:33.132761 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.132756 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:41.132742806 +0000 UTC m=+17.264519471 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:33.503695 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.503663 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:16:33.503695 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.503674 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:16:33.503952 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.503851 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077" Apr 17 11:16:33.503952 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.503848 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29" Apr 17 11:16:33.567995 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.567959 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/1.log" Apr 17 11:16:33.568550 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.568401 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/0.log" Apr 17 11:16:33.568858 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.568829 2571 generic.go:358] "Generic (PLEG): container finished" podID="c05921670fca4842dd48b5deb56ad8b1" containerID="e790636d5d35f42b21255d6f88db7c2ff45b9614817baba9106bfcf64328a86c" exitCode=1 Apr 17 11:16:33.569104 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.569081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerDied","Data":"e790636d5d35f42b21255d6f88db7c2ff45b9614817baba9106bfcf64328a86c"} Apr 17 11:16:33.569166 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.569133 2571 scope.go:117] "RemoveContainer" containerID="44403f5b9d652eed5bdf6455d535f44917603a38521e55714c50a8067358c1d0" Apr 17 11:16:33.569373 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.569357 2571 scope.go:117] "RemoveContainer" containerID="e790636d5d35f42b21255d6f88db7c2ff45b9614817baba9106bfcf64328a86c" Apr 17 11:16:33.569564 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:33.569536 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_openshift-machine-config-operator(c05921670fca4842dd48b5deb56ad8b1)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" podUID="c05921670fca4842dd48b5deb56ad8b1" Apr 17 11:16:33.590066 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:33.590008 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h527p" podStartSLOduration=4.924807708 podStartE2EDuration="9.589990859s" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC" firstStartedPulling="2026-04-17 11:16:25.684541233 +0000 UTC m=+1.816317912" lastFinishedPulling="2026-04-17 11:16:30.349724392 +0000 UTC m=+6.481501063" observedRunningTime="2026-04-17 11:16:32.624922743 +0000 UTC m=+8.756699431" watchObservedRunningTime="2026-04-17 11:16:33.589990859 +0000 UTC m=+9.721767548" Apr 17 11:16:34.570788 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:34.570754 2571 scope.go:117] "RemoveContainer" containerID="e790636d5d35f42b21255d6f88db7c2ff45b9614817baba9106bfcf64328a86c" Apr 17 11:16:34.571377 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:34.570925 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_openshift-machine-config-operator(c05921670fca4842dd48b5deb56ad8b1)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" podUID="c05921670fca4842dd48b5deb56ad8b1" Apr 17 11:16:35.503790 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:35.503751 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:35.503961 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:35.503736 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:35.503961 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:35.503893 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:35.504041 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:35.503982 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:37.503780 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:37.503740 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:37.504342 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:37.503753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:37.504342 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:37.503865 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:37.504342 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:37.503955 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:39.503661 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.503484 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:39.504088 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.503483 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:39.504088 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:39.503740 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:39.504088 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:39.503808 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:39.580452 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.580420 2571 generic.go:358] "Generic (PLEG): container finished" podID="924b7a18-8af5-4d88-b11b-b9df79a3809c" containerID="44e6033476f1f225ee244eac15c1325a3fcc19a2ee3de3c7f294469281a317ee" exitCode=0
Apr 17 11:16:39.580594 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.580487 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerDied","Data":"44e6033476f1f225ee244eac15c1325a3fcc19a2ee3de3c7f294469281a317ee"}
Apr 17 11:16:39.581858 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.581842 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/1.log"
Apr 17 11:16:39.584673 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.584652 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"b46c1af3cf9ca5b3af785ac7c2e6cf22eaab35d45ecdb06c7498fa025bfaa840"}
Apr 17 11:16:39.584759 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.584679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"5d8157111ae4228cfe98f756866b48cef6a725dfef649eefae303e2468414d66"}
Apr 17 11:16:39.584759 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.584689 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"67459e9d4b3040a3c6850a33101e6ff78e198c0c5750705d346e3caaa93b52be"}
Apr 17 11:16:39.584759 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.584697 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"9282e4f980a8868392ea8927cdb702a08d89edec257aa973db8f0e3e3393549e"}
Apr 17 11:16:39.584759 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.584705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"1298aa18730ac5b56d682e6a53a75fe8394961d5e9081c65341d9b64fe2e2ca0"}
Apr 17 11:16:39.584759 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:39.584713 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"2c630bcbd5c02046fc1c79bab6ddd499908392a14c5e65d081799bcf29e9c89e"}
Apr 17 11:16:40.588182 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:40.588151 2571 generic.go:358] "Generic (PLEG): container finished" podID="924b7a18-8af5-4d88-b11b-b9df79a3809c" containerID="cd00953081ac56078473b1aa8250ba0217ef23cc62bc45466283d916b79a0a8e" exitCode=0
Apr 17 11:16:40.588593 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:40.588199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerDied","Data":"cd00953081ac56078473b1aa8250ba0217ef23cc62bc45466283d916b79a0a8e"}
Apr 17 11:16:41.083867 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.083834 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:41.084033 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.083987 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:41.084104 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.084043 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:57.084029409 +0000 UTC m=+33.215806073 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:41.184738 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.184703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:41.184885 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.184819 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:41.184885 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.184832 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:41.184885 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.184841 2571 projected.go:194] Error preparing data for projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:41.185042 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.184888 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:57.184876323 +0000 UTC m=+33.316652988 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:41.503677 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.503645 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:41.503821 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.503650 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:41.503821 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.503747 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:41.503891 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:41.503863 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:41.592242 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.592141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"1facdf059921166ab87a643fd0c0f0f4bf9509bb8639b971f09c59f8b5d6c0c8"}
Apr 17 11:16:41.594014 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.593993 2571 generic.go:358] "Generic (PLEG): container finished" podID="924b7a18-8af5-4d88-b11b-b9df79a3809c" containerID="d6d62cdb6700da591cd08065be9921faeb111ac11593ea87132b4819c1fab657" exitCode=0
Apr 17 11:16:41.594014 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:41.594022 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerDied","Data":"d6d62cdb6700da591cd08065be9921faeb111ac11593ea87132b4819c1fab657"}
Apr 17 11:16:43.503443 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:43.503415 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:43.503915 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:43.503415 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:43.503915 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:43.503557 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:43.503915 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:43.503629 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:43.600291 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:43.600257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" event={"ID":"45d5b2c3-f6da-4326-a766-68dc042f85ef","Type":"ContainerStarted","Data":"b3ab9212722b76c933dd80a402958412c3ac3e926ffb9a89d899085974e39856"}
Apr 17 11:16:44.605112 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.605072 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v"
Apr 17 11:16:44.605112 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.605115 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v"
Apr 17 11:16:44.605917 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.605129 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v"
Apr 17 11:16:44.614785 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.614746 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gln7w"
Apr 17 11:16:44.622398 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.622373 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v"
Apr 17 11:16:44.622532 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.622521 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v"
Apr 17 11:16:44.648603 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.648560 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v" podStartSLOduration=7.436454795 podStartE2EDuration="20.648545099s" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC" firstStartedPulling="2026-04-17 11:16:25.696680579 +0000 UTC m=+1.828457245" lastFinishedPulling="2026-04-17 11:16:38.908770878 +0000 UTC m=+15.040547549" observedRunningTime="2026-04-17 11:16:44.64738614 +0000 UTC m=+20.779162808" watchObservedRunningTime="2026-04-17 11:16:44.648545099 +0000 UTC m=+20.780321780"
Apr 17 11:16:44.742898 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.742870 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-wk5dr"]
Apr 17 11:16:44.768059 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.768023 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:44.771499 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.771467 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 11:16:44.771709 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.771657 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x47ml\""
Apr 17 11:16:44.773281 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.773256 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 11:16:44.801432 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.801402 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"]
Apr 17 11:16:44.807068 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.807028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/823b0362-e067-41b2-b7ac-63303987f87e-agent-certs\") pod \"konnectivity-agent-wk5dr\" (UID: \"823b0362-e067-41b2-b7ac-63303987f87e\") " pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:44.807068 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.807064 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/823b0362-e067-41b2-b7ac-63303987f87e-konnectivity-ca\") pod \"konnectivity-agent-wk5dr\" (UID: \"823b0362-e067-41b2-b7ac-63303987f87e\") " pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:44.821675 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.821578 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-7st2w"]
Apr 17 11:16:44.850235 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.839989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.850235 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.840383 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.855283 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.853970 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:16:44.855283 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.854329 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kkrpp\""
Apr 17 11:16:44.855283 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.854532 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:44.855283 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.854721 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:44.856237 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.855762 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-j5ljp"]
Apr 17 11:16:44.857763 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.856731 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:16:44.857763 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.856947 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:16:44.857763 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.857508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qhhfc\""
Apr 17 11:16:44.879503 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.878838 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:44.882094 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.881922 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gq95b\""
Apr 17 11:16:44.882200 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.882187 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:16:44.883407 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.882474 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:16:44.883407 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.882651 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:16:44.890786 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.890758 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-q9pfr"]
Apr 17 11:16:44.905455 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.905428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:44.907562 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907543 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9be12e9c-2a20-44bd-8808-6135db522a47-tmp\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.907679 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.907679 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-sys-fs\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.907679 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9pk\" (UniqueName: \"kubernetes.io/projected/d325f4f2-5692-4489-9be3-cb5403f6b917-kube-api-access-qn9pk\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:44.907810 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysctl-conf\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.907810 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907709 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9g9\" (UniqueName: \"kubernetes.io/projected/9be12e9c-2a20-44bd-8808-6135db522a47-kube-api-access-9n9g9\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.907810 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-registration-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.907948 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907809 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcj7c\" (UniqueName: \"kubernetes.io/projected/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-kube-api-access-dcj7c\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.907948 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-socket-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.907948 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/823b0362-e067-41b2-b7ac-63303987f87e-agent-certs\") pod \"konnectivity-agent-wk5dr\" (UID: \"823b0362-e067-41b2-b7ac-63303987f87e\") " pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:44.908082 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysctl-d\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908082 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.907984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-run\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908082 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9be12e9c-2a20-44bd-8808-6135db522a47-etc-tuned\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908082 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d325f4f2-5692-4489-9be3-cb5403f6b917-host\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:44.908284 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-modprobe-d\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908284 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-sys\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908284 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-host\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908284 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-etc-selinux\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.908284 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d325f4f2-5692-4489-9be3-cb5403f6b917-serviceca\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:44.908284 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-lib-modules\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908555 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908334 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sv9hn\""
Apr 17 11:16:44.908555 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/823b0362-e067-41b2-b7ac-63303987f87e-konnectivity-ca\") pod \"konnectivity-agent-wk5dr\" (UID: \"823b0362-e067-41b2-b7ac-63303987f87e\") " pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:44.908555 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysconfig\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908555 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-kubernetes\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908555 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-systemd\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908787 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908572 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-var-lib-kubelet\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:44.908787 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908594 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:16:44.908787 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-device-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:44.908916 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.908899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/823b0362-e067-41b2-b7ac-63303987f87e-konnectivity-ca\") pod \"konnectivity-agent-wk5dr\" (UID: \"823b0362-e067-41b2-b7ac-63303987f87e\") " pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:44.911396 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:44.911368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/823b0362-e067-41b2-b7ac-63303987f87e-agent-certs\") pod \"konnectivity-agent-wk5dr\" (UID: \"823b0362-e067-41b2-b7ac-63303987f87e\") " pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:45.009382 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-etc-kubernetes\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.009526 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgpw\" (UniqueName: \"kubernetes.io/projected/5f80200a-a69d-4400-b71b-efebc2ef29c6-kube-api-access-drgpw\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.009526 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysctl-d\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009643 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-run\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009643 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9be12e9c-2a20-44bd-8808-6135db522a47-etc-tuned\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009643 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-netns\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.009643 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysctl-d\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009643 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-run\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-multus-certs\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d325f4f2-5692-4489-9be3-cb5403f6b917-host\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-modprobe-d\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-sys\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-host\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d325f4f2-5692-4489-9be3-cb5403f6b917-host\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009786 2571 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-hostroot\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-etc-selinux\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d325f4f2-5692-4489-9be3-cb5403f6b917-serviceca\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp" Apr 17 11:16:45.009831 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-sys\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-lib-modules\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009862 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysconfig\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-modprobe-d\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-kubernetes\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-systemd\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-kubernetes\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-var-lib-kubelet\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-etc-selinux\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-device-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-system-cni-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-systemd\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010040 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-device-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-cni-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysconfig\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.009831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-host\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-lib-modules\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010072 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-var-lib-kubelet\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.010355 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9be12e9c-2a20-44bd-8808-6135db522a47-tmp\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010114 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-sys-fs\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9pk\" (UniqueName: \"kubernetes.io/projected/d325f4f2-5692-4489-9be3-cb5403f6b917-kube-api-access-qn9pk\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010196 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-cni-multus\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-conf-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-sys-fs\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysctl-conf\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.011160 
ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-cnibin\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9be12e9c-2a20-44bd-8808-6135db522a47-etc-sysctl-conf\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-socket-dir-parent\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-kubelet\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d325f4f2-5692-4489-9be3-cb5403f6b917-serviceca\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp" Apr 17 11:16:45.011160 
ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010552 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9g9\" (UniqueName: \"kubernetes.io/projected/9be12e9c-2a20-44bd-8808-6135db522a47-kube-api-access-9n9g9\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-registration-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcj7c\" (UniqueName: \"kubernetes.io/projected/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-kube-api-access-dcj7c\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011160 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010677 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-os-release\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010701 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-k8s-cni-cncf-io\") pod 
\"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-socket-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-daemon-config\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f80200a-a69d-4400-b71b-efebc2ef29c6-cni-binary-copy\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-registration-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-cni-bin\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.011856 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.010975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-socket-dir\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.021447 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.021421 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9pk\" (UniqueName: \"kubernetes.io/projected/d325f4f2-5692-4489-9be3-cb5403f6b917-kube-api-access-qn9pk\") pod \"node-ca-j5ljp\" (UID: \"d325f4f2-5692-4489-9be3-cb5403f6b917\") " pod="openshift-image-registry/node-ca-j5ljp" Apr 17 11:16:45.021773 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.021753 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcj7c\" (UniqueName: \"kubernetes.io/projected/e5b84fe7-4fc3-46b2-9802-99ab5fca04cf-kube-api-access-dcj7c\") pod \"aws-ebs-csi-driver-node-dgfph\" (UID: \"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" Apr 17 11:16:45.023225 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.023190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9be12e9c-2a20-44bd-8808-6135db522a47-etc-tuned\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.023317 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.023266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/9be12e9c-2a20-44bd-8808-6135db522a47-tmp\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.025139 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.025122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9g9\" (UniqueName: \"kubernetes.io/projected/9be12e9c-2a20-44bd-8808-6135db522a47-kube-api-access-9n9g9\") pod \"tuned-7st2w\" (UID: \"9be12e9c-2a20-44bd-8808-6135db522a47\") " pod="openshift-cluster-node-tuning-operator/tuned-7st2w" Apr 17 11:16:45.094136 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.094103 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wk5dr" Apr 17 11:16:45.101135 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:45.101110 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod823b0362_e067_41b2_b7ac_63303987f87e.slice/crio-37fb2c6e5acc7775dc8a1e157c52e97bf95d0ebb3de7f3cccfdbca8caf35ff9e WatchSource:0}: Error finding container 37fb2c6e5acc7775dc8a1e157c52e97bf95d0ebb3de7f3cccfdbca8caf35ff9e: Status 404 returned error can't find the container with id 37fb2c6e5acc7775dc8a1e157c52e97bf95d0ebb3de7f3cccfdbca8caf35ff9e Apr 17 11:16:45.111492 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-system-cni-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111492 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-cni-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-cni-multus\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-conf-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-conf-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111575 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-system-cni-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111577 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-cni-multus\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-cnibin\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-cnibin\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-cni-dir\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111642 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-socket-dir-parent\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr" Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-kubelet\") pod 
\"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-os-release\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-k8s-cni-cncf-io\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-socket-dir-parent\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-daemon-config\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111737 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-kubelet\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111748 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-k8s-cni-cncf-io\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f80200a-a69d-4400-b71b-efebc2ef29c6-cni-binary-copy\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111775 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-os-release\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111802 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-cni-bin\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-etc-kubernetes\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drgpw\" (UniqueName: \"kubernetes.io/projected/5f80200a-a69d-4400-b71b-efebc2ef29c6-kube-api-access-drgpw\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-netns\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-var-lib-cni-bin\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-etc-kubernetes\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-multus-certs\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-multus-certs\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.111976 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-hostroot\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.112514 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-hostroot\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.112514 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.111963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f80200a-a69d-4400-b71b-efebc2ef29c6-host-run-netns\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.112514 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.112206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f80200a-a69d-4400-b71b-efebc2ef29c6-multus-daemon-config\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.112514 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.112286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f80200a-a69d-4400-b71b-efebc2ef29c6-cni-binary-copy\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.125881 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.125857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgpw\" (UniqueName: \"kubernetes.io/projected/5f80200a-a69d-4400-b71b-efebc2ef29c6-kube-api-access-drgpw\") pod \"multus-q9pfr\" (UID: \"5f80200a-a69d-4400-b71b-efebc2ef29c6\") " pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.155935 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.155756 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7st2w"
Apr 17 11:16:45.164651 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.164629 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph"
Apr 17 11:16:45.192507 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.192456 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5ljp"
Apr 17 11:16:45.219485 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.219457 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q9pfr"
Apr 17 11:16:45.504360 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.504326 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:45.504533 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.504380 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:45.504533 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:45.504475 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:45.504844 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.504820 2571 scope.go:117] "RemoveContainer" containerID="e790636d5d35f42b21255d6f88db7c2ff45b9614817baba9106bfcf64328a86c"
Apr 17 11:16:45.504960 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:45.504841 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:45.606119 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.606084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wk5dr" event={"ID":"823b0362-e067-41b2-b7ac-63303987f87e","Type":"ContainerStarted","Data":"37fb2c6e5acc7775dc8a1e157c52e97bf95d0ebb3de7f3cccfdbca8caf35ff9e"}
Apr 17 11:16:45.624206 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.624166 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:44 +0000 UTC" deadline="2027-10-01 22:47:21.04010402 +0000 UTC"
Apr 17 11:16:45.624339 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.624195 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12779h30m35.41591321s"
Apr 17 11:16:45.785559 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.785478 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-82xvr"]
Apr 17 11:16:45.785710 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.785603 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:45.785765 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:45.785705 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:45.789408 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.789381 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fcgzf"]
Apr 17 11:16:45.789522 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:45.789480 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:45.789611 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:45.789585 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:46.625017 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:46.624971 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:44 +0000 UTC" deadline="2027-11-14 13:26:30.524778941 +0000 UTC"
Apr 17 11:16:46.625017 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:46.625010 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13826h9m43.899773403s"
Apr 17 11:16:47.317462 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:47.317433 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f80200a_a69d_4400_b71b_efebc2ef29c6.slice/crio-1dd6d4468ee90911ba5dba76ab5b8f974bb10eee489c2bec11035a52bc81ecea WatchSource:0}: Error finding container 1dd6d4468ee90911ba5dba76ab5b8f974bb10eee489c2bec11035a52bc81ecea: Status 404 returned error can't find the container with id 1dd6d4468ee90911ba5dba76ab5b8f974bb10eee489c2bec11035a52bc81ecea
Apr 17 11:16:47.319033 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:47.318920 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5b84fe7_4fc3_46b2_9802_99ab5fca04cf.slice/crio-756f2845bfb021a43f2e5458b9012f1df556cebeac76b042bd5710f0e5188891 WatchSource:0}: Error finding container 756f2845bfb021a43f2e5458b9012f1df556cebeac76b042bd5710f0e5188891: Status 404 returned error can't find the container with id 756f2845bfb021a43f2e5458b9012f1df556cebeac76b042bd5710f0e5188891
Apr 17 11:16:47.342722 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:47.342692 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be12e9c_2a20_44bd_8808_6135db522a47.slice/crio-ee01ab8ed4cbf754c25a0db72f426830f6fdf0a929899801fd1fbc3385da55c9 WatchSource:0}: Error finding container ee01ab8ed4cbf754c25a0db72f426830f6fdf0a929899801fd1fbc3385da55c9: Status 404 returned error can't find the container with id ee01ab8ed4cbf754c25a0db72f426830f6fdf0a929899801fd1fbc3385da55c9
Apr 17 11:16:47.358229 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:16:47.358186 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd325f4f2_5692_4489_9be3_cb5403f6b917.slice/crio-22e6ffe224e5c303c306ca7c2665a177f0a736bb74d865425b72268f9d2fc539 WatchSource:0}: Error finding container 22e6ffe224e5c303c306ca7c2665a177f0a736bb74d865425b72268f9d2fc539: Status 404 returned error can't find the container with id 22e6ffe224e5c303c306ca7c2665a177f0a736bb74d865425b72268f9d2fc539
Apr 17 11:16:47.503287 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.503261 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:47.503406 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.503260 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:47.503479 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:47.503400 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:47.503479 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:47.503467 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:47.611903 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.611872 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/1.log"
Apr 17 11:16:47.612345 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.612315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerStarted","Data":"b5a18295108168da0bf2566f3744b9e8f2f7b52e739c61c4dcb694d8a89bf0e8"}
Apr 17 11:16:47.613765 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.613742 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5ljp" event={"ID":"d325f4f2-5692-4489-9be3-cb5403f6b917","Type":"ContainerStarted","Data":"22e6ffe224e5c303c306ca7c2665a177f0a736bb74d865425b72268f9d2fc539"}
Apr 17 11:16:47.614776 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.614751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7st2w" event={"ID":"9be12e9c-2a20-44bd-8808-6135db522a47","Type":"ContainerStarted","Data":"ee01ab8ed4cbf754c25a0db72f426830f6fdf0a929899801fd1fbc3385da55c9"}
Apr 17 11:16:47.615885 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.615864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" event={"ID":"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf","Type":"ContainerStarted","Data":"756f2845bfb021a43f2e5458b9012f1df556cebeac76b042bd5710f0e5188891"}
Apr 17 11:16:47.617184 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:47.617163 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q9pfr" event={"ID":"5f80200a-a69d-4400-b71b-efebc2ef29c6","Type":"ContainerStarted","Data":"1dd6d4468ee90911ba5dba76ab5b8f974bb10eee489c2bec11035a52bc81ecea"}
Apr 17 11:16:48.624239 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:48.623931 2571 generic.go:358] "Generic (PLEG): container finished" podID="924b7a18-8af5-4d88-b11b-b9df79a3809c" containerID="c512141e172c0bf469a6ca6f3a2eeab02ab11747f3a387fa1af6debf6b191fea" exitCode=0
Apr 17 11:16:48.624239 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:48.624224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerDied","Data":"c512141e172c0bf469a6ca6f3a2eeab02ab11747f3a387fa1af6debf6b191fea"}
Apr 17 11:16:48.650591 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:48.649799 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" podStartSLOduration=23.649777937 podStartE2EDuration="23.649777937s" podCreationTimestamp="2026-04-17 11:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:47.629698246 +0000 UTC m=+23.761474936" watchObservedRunningTime="2026-04-17 11:16:48.649777937 +0000 UTC m=+24.781554626"
Apr 17 11:16:49.503449 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:49.503414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:49.503658 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:49.503414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:49.503658 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:49.503546 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:49.503658 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:49.503645 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:51.504369 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:51.504332 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:51.504981 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:51.504332 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:51.504981 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:51.504478 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:51.504981 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:51.504569 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:52.632996 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.632751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wk5dr" event={"ID":"823b0362-e067-41b2-b7ac-63303987f87e","Type":"ContainerStarted","Data":"be25c765873e0a249254708b75294643807325273c81ff04cc948787dee7a573"}
Apr 17 11:16:52.634144 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.634117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5ljp" event={"ID":"d325f4f2-5692-4489-9be3-cb5403f6b917","Type":"ContainerStarted","Data":"6cfcf737ec764278e454bd6d3b182d27d1a392a90c4099de4fb3834271fe3957"}
Apr 17 11:16:52.637380 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.637351 2571 generic.go:358] "Generic (PLEG): container finished" podID="924b7a18-8af5-4d88-b11b-b9df79a3809c" containerID="cf862a2209f9f6fd909077159a9f46286b2a6978236e5150e96b621fe2a0efc3" exitCode=0
Apr 17 11:16:52.637507 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.637438 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerDied","Data":"cf862a2209f9f6fd909077159a9f46286b2a6978236e5150e96b621fe2a0efc3"}
Apr 17 11:16:52.638900 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.638875 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7st2w" event={"ID":"9be12e9c-2a20-44bd-8808-6135db522a47","Type":"ContainerStarted","Data":"43c1a893f554bac4b200b02c9d3a4d3188ac5e9f64321ffe5a9dfaec8e316a79"}
Apr 17 11:16:52.649872 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.649833 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wk5dr" podStartSLOduration=1.389490308 podStartE2EDuration="8.649821621s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:45.102575973 +0000 UTC m=+21.234352638" lastFinishedPulling="2026-04-17 11:16:52.362907272 +0000 UTC m=+28.494683951" observedRunningTime="2026-04-17 11:16:52.649333753 +0000 UTC m=+28.781110442" watchObservedRunningTime="2026-04-17 11:16:52.649821621 +0000 UTC m=+28.781598304"
Apr 17 11:16:52.712769 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.712725 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j5ljp" podStartSLOduration=3.7090165170000002 podStartE2EDuration="8.712709125s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.36108813 +0000 UTC m=+23.492864804" lastFinishedPulling="2026-04-17 11:16:52.364780742 +0000 UTC m=+28.496557412" observedRunningTime="2026-04-17 11:16:52.712664848 +0000 UTC m=+28.844441536" watchObservedRunningTime="2026-04-17 11:16:52.712709125 +0000 UTC m=+28.844485812"
Apr 17 11:16:52.713001 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:52.712980 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7st2w" podStartSLOduration=3.6726592179999997 podStartE2EDuration="8.712975912s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.34401128 +0000 UTC m=+23.475787946" lastFinishedPulling="2026-04-17 11:16:52.384327974 +0000 UTC m=+28.516104640" observedRunningTime="2026-04-17 11:16:52.695963605 +0000 UTC m=+28.827740292" watchObservedRunningTime="2026-04-17 11:16:52.712975912 +0000 UTC m=+28.844752601"
Apr 17 11:16:53.503763 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:53.503727 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:53.503912 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:53.503727 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:53.503912 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:53.503870 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:53.504072 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:53.503962 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:53.645093 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:53.643518 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" event={"ID":"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf","Type":"ContainerStarted","Data":"9d40bd65d86f4a201e6f55a2af264b446f00ea1289664e763daa2b5469f3358c"}
Apr 17 11:16:53.669571 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:53.669541 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 11:16:54.434407 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:54.434290 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:53.669565003Z","UUID":"eeaf8420-19cc-4da7-a6fe-71b6fb6c2050","Handler":null,"Name":"","Endpoint":""}
Apr 17 11:16:54.436771 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:54.436743 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 11:16:54.436911 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:54.436780 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 11:16:54.647557 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:54.647512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" event={"ID":"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf","Type":"ContainerStarted","Data":"2407e8408e4eb7b5e83d1e8f971bc8a2fb66e75f79f62b32d98887a6dbd4271c"}
Apr 17 11:16:55.095058 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:55.095029 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:55.095716 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:55.095690 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:16:55.503727 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:55.503694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:55.503911 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:55.503694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:55.503911 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:55.503827 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:55.503911 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:55.503898 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:57.103582 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:57.103541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:57.104152 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.103730 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:57.104152 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.103825 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs podName:1b1c44fb-0716-4e07-9409-72264a348f29 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:29.103804393 +0000 UTC m=+65.235581072 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs") pod "network-metrics-daemon-fcgzf" (UID: "1b1c44fb-0716-4e07-9409-72264a348f29") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:57.204549 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:57.204508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:57.204745 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.204697 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:57.204745 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.204721 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:57.204745 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.204736 2571 projected.go:194] Error preparing data for projected volume kube-api-access-dmbfl for pod openshift-network-diagnostics/network-check-target-82xvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:57.204900 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.204805 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl podName:97b2b03f-63f9-420a-8e36-4e191f507077 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:29.204783648 +0000 UTC m=+65.336560328 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmbfl" (UniqueName: "kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl") pod "network-check-target-82xvr" (UID: "97b2b03f-63f9-420a-8e36-4e191f507077") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:57.503362 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:57.503318 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:57.503547 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:57.503318 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:57.503547 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.503459 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:57.503547 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:57.503508 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:58.656193 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:58.656120 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q9pfr" event={"ID":"5f80200a-a69d-4400-b71b-efebc2ef29c6","Type":"ContainerStarted","Data":"26ed0bcad34c199fb3e3fe351f062e40b2161a77b6c6565a3d6416f5b5d575ee"}
Apr 17 11:16:58.688226 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:58.688163 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q9pfr" podStartSLOduration=3.8842277530000002 podStartE2EDuration="14.688148178s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.319258991 +0000 UTC m=+23.451035660" lastFinishedPulling="2026-04-17 11:16:58.12317942 +0000 UTC m=+34.254956085" observedRunningTime="2026-04-17 11:16:58.688049635 +0000 UTC m=+34.819826323" watchObservedRunningTime="2026-04-17 11:16:58.688148178 +0000 UTC m=+34.819924911"
Apr 17 11:16:59.503999 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.503968 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:16:59.504177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.503967 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:16:59.504177 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:59.504077 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcgzf" podUID="1b1c44fb-0716-4e07-9409-72264a348f29"
Apr 17 11:16:59.504177 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:16:59.504135 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-82xvr" podUID="97b2b03f-63f9-420a-8e36-4e191f507077"
Apr 17 11:16:59.654317 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.654287 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeReady"
Apr 17 11:16:59.654473 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.654415 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 11:16:59.661914 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.661886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bq85c" event={"ID":"924b7a18-8af5-4d88-b11b-b9df79a3809c","Type":"ContainerStarted","Data":"45a40bce5dd92e889cc93272b9099bd274d4c35401e651b577c772bfed0d3bd8"}
Apr 17 11:16:59.663781 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.663757 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" event={"ID":"e5b84fe7-4fc3-46b2-9802-99ab5fca04cf","Type":"ContainerStarted","Data":"27a3635188f056c6da4d584900ffecbc7f1e52658549f5035b6539e84459d259"}
Apr 17 11:16:59.691837 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.691789 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bq85c" podStartSLOduration=2.733027811 podStartE2EDuration="35.691775829s" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC"
firstStartedPulling="2026-04-17 11:16:25.679724945 +0000 UTC m=+1.811501625" lastFinishedPulling="2026-04-17 11:16:58.638472978 +0000 UTC m=+34.770249643" observedRunningTime="2026-04-17 11:16:59.691195591 +0000 UTC m=+35.822972280" watchObservedRunningTime="2026-04-17 11:16:59.691775829 +0000 UTC m=+35.823552515" Apr 17 11:16:59.699555 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.699523 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tjbc7"] Apr 17 11:16:59.703336 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.703311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.704617 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.704600 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5m4mb"] Apr 17 11:16:59.706641 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.706625 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:59.707137 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.707122 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lbntp\"" Apr 17 11:16:59.707335 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.707314 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:59.707520 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.707505 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:59.707788 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.707772 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.711607 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.711587 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qr57f\"" Apr 17 11:16:59.711922 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.711905 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:16:59.712014 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.711910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 11:16:59.712175 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.712160 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 11:16:59.712248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.712161 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:16:59.715953 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.715935 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tjbc7"] Apr 17 11:16:59.718120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.718103 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5m4mb"] Apr 17 11:16:59.722730 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.722688 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dgfph" podStartSLOduration=4.016873063 podStartE2EDuration="15.722675094s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.320628394 +0000 UTC m=+23.452405059" 
lastFinishedPulling="2026-04-17 11:16:59.026430425 +0000 UTC m=+35.158207090" observedRunningTime="2026-04-17 11:16:59.721830293 +0000 UTC m=+35.853607006" watchObservedRunningTime="2026-04-17 11:16:59.722675094 +0000 UTC m=+35.854451784" Apr 17 11:16:59.734518 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.734494 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7t99h"] Apr 17 11:16:59.737750 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.737731 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.742595 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.742579 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:59.742802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.742786 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:59.742802 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.742795 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8bg7\"" Apr 17 11:16:59.750536 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.750517 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7t99h"] Apr 17 11:16:59.824490 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3064d784-d81b-4df1-ad85-09f7ec7037db-metrics-tls\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.824490 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/3064d784-d81b-4df1-ad85-09f7ec7037db-tmp-dir\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.824490 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64efb360-ec45-436d-87b0-6cd63e034c78-crio-socket\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.824699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824508 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64efb360-ec45-436d-87b0-6cd63e034c78-data-volume\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.824699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/586b9443-fc3c-482e-be48-5fd2e6e6cbe4-cert\") pod \"ingress-canary-tjbc7\" (UID: \"586b9443-fc3c-482e-be48-5fd2e6e6cbe4\") " pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.824699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3064d784-d81b-4df1-ad85-09f7ec7037db-config-volume\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.824699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824631 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wl4\" (UniqueName: \"kubernetes.io/projected/586b9443-fc3c-482e-be48-5fd2e6e6cbe4-kube-api-access-p4wl4\") pod \"ingress-canary-tjbc7\" (UID: \"586b9443-fc3c-482e-be48-5fd2e6e6cbe4\") " pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.824699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zh5q\" (UniqueName: \"kubernetes.io/projected/3064d784-d81b-4df1-ad85-09f7ec7037db-kube-api-access-2zh5q\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.824699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64efb360-ec45-436d-87b0-6cd63e034c78-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.824873 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824706 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csf22\" (UniqueName: \"kubernetes.io/projected/64efb360-ec45-436d-87b0-6cd63e034c78-kube-api-access-csf22\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.824873 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.824725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64efb360-ec45-436d-87b0-6cd63e034c78-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.925849 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.925801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3064d784-d81b-4df1-ad85-09f7ec7037db-metrics-tls\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.926007 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.925882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3064d784-d81b-4df1-ad85-09f7ec7037db-tmp-dir\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.926007 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.925909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64efb360-ec45-436d-87b0-6cd63e034c78-crio-socket\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926007 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.925932 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64efb360-ec45-436d-87b0-6cd63e034c78-data-volume\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926007 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.925963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/586b9443-fc3c-482e-be48-5fd2e6e6cbe4-cert\") pod 
\"ingress-canary-tjbc7\" (UID: \"586b9443-fc3c-482e-be48-5fd2e6e6cbe4\") " pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.926007 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.925990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3064d784-d81b-4df1-ad85-09f7ec7037db-config-volume\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.926286 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4wl4\" (UniqueName: \"kubernetes.io/projected/586b9443-fc3c-482e-be48-5fd2e6e6cbe4-kube-api-access-p4wl4\") pod \"ingress-canary-tjbc7\" (UID: \"586b9443-fc3c-482e-be48-5fd2e6e6cbe4\") " pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.926286 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926036 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64efb360-ec45-436d-87b0-6cd63e034c78-crio-socket\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926286 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zh5q\" (UniqueName: \"kubernetes.io/projected/3064d784-d81b-4df1-ad85-09f7ec7037db-kube-api-access-2zh5q\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.926286 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926104 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/64efb360-ec45-436d-87b0-6cd63e034c78-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926286 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csf22\" (UniqueName: \"kubernetes.io/projected/64efb360-ec45-436d-87b0-6cd63e034c78-kube-api-access-csf22\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926286 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926168 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64efb360-ec45-436d-87b0-6cd63e034c78-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926545 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64efb360-ec45-436d-87b0-6cd63e034c78-data-volume\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.926545 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3064d784-d81b-4df1-ad85-09f7ec7037db-tmp-dir\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.926697 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926670 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3064d784-d81b-4df1-ad85-09f7ec7037db-config-volume\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.926803 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.926705 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64efb360-ec45-436d-87b0-6cd63e034c78-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.929937 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.929918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3064d784-d81b-4df1-ad85-09f7ec7037db-metrics-tls\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:16:59.930015 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.929947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64efb360-ec45-436d-87b0-6cd63e034c78-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.930591 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.930569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/586b9443-fc3c-482e-be48-5fd2e6e6cbe4-cert\") pod \"ingress-canary-tjbc7\" (UID: \"586b9443-fc3c-482e-be48-5fd2e6e6cbe4\") " pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.936742 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.936717 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4wl4\" (UniqueName: \"kubernetes.io/projected/586b9443-fc3c-482e-be48-5fd2e6e6cbe4-kube-api-access-p4wl4\") pod \"ingress-canary-tjbc7\" (UID: \"586b9443-fc3c-482e-be48-5fd2e6e6cbe4\") " pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:16:59.936846 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.936815 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csf22\" (UniqueName: \"kubernetes.io/projected/64efb360-ec45-436d-87b0-6cd63e034c78-kube-api-access-csf22\") pod \"insights-runtime-extractor-5m4mb\" (UID: \"64efb360-ec45-436d-87b0-6cd63e034c78\") " pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:16:59.936912 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:16:59.936873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zh5q\" (UniqueName: \"kubernetes.io/projected/3064d784-d81b-4df1-ad85-09f7ec7037db-kube-api-access-2zh5q\") pod \"dns-default-7t99h\" (UID: \"3064d784-d81b-4df1-ad85-09f7ec7037db\") " pod="openshift-dns/dns-default-7t99h" Apr 17 11:17:00.014010 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.013960 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjbc7" Apr 17 11:17:00.019758 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.019730 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5m4mb" Apr 17 11:17:00.045525 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.045499 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7t99h" Apr 17 11:17:00.190712 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.190671 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tjbc7"] Apr 17 11:17:00.194579 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:00.194553 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod586b9443_fc3c_482e_be48_5fd2e6e6cbe4.slice/crio-bc6f003a8907c164f5446dd0f8a02a63e2522d19f3b6341e0b01f1d6084bcfef WatchSource:0}: Error finding container bc6f003a8907c164f5446dd0f8a02a63e2522d19f3b6341e0b01f1d6084bcfef: Status 404 returned error can't find the container with id bc6f003a8907c164f5446dd0f8a02a63e2522d19f3b6341e0b01f1d6084bcfef Apr 17 11:17:00.403744 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.403639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7t99h"] Apr 17 11:17:00.404185 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.404159 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5m4mb"] Apr 17 11:17:00.406995 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:00.406963 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64efb360_ec45_436d_87b0_6cd63e034c78.slice/crio-a58efdf6eed8fddbadc8598b15fb3c71be29881a2b7528130bae37294286a859 WatchSource:0}: Error finding container a58efdf6eed8fddbadc8598b15fb3c71be29881a2b7528130bae37294286a859: Status 404 returned error can't find the container with id a58efdf6eed8fddbadc8598b15fb3c71be29881a2b7528130bae37294286a859 Apr 17 11:17:00.407624 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:00.407596 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3064d784_d81b_4df1_ad85_09f7ec7037db.slice/crio-eb7cf4af3d0860c39fbc52e5991d8b69c46e4fafa6f95a873b880ab1cb654d81 WatchSource:0}: Error finding container eb7cf4af3d0860c39fbc52e5991d8b69c46e4fafa6f95a873b880ab1cb654d81: Status 404 returned error can't find the container with id eb7cf4af3d0860c39fbc52e5991d8b69c46e4fafa6f95a873b880ab1cb654d81 Apr 17 11:17:00.667247 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.667153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tjbc7" event={"ID":"586b9443-fc3c-482e-be48-5fd2e6e6cbe4","Type":"ContainerStarted","Data":"bc6f003a8907c164f5446dd0f8a02a63e2522d19f3b6341e0b01f1d6084bcfef"} Apr 17 11:17:00.668324 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.668295 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7t99h" event={"ID":"3064d784-d81b-4df1-ad85-09f7ec7037db","Type":"ContainerStarted","Data":"eb7cf4af3d0860c39fbc52e5991d8b69c46e4fafa6f95a873b880ab1cb654d81"} Apr 17 11:17:00.669680 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.669651 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5m4mb" event={"ID":"64efb360-ec45-436d-87b0-6cd63e034c78","Type":"ContainerStarted","Data":"ddfdf7543e5f9d93334c1ac1b6e8c09ce4c0e86268cd29dc420ed2e40c56d659"} Apr 17 11:17:00.669797 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:00.669689 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5m4mb" event={"ID":"64efb360-ec45-436d-87b0-6cd63e034c78","Type":"ContainerStarted","Data":"a58efdf6eed8fddbadc8598b15fb3c71be29881a2b7528130bae37294286a859"} Apr 17 11:17:01.503376 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.503332 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:17:01.503553 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.503332 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:17:01.507697 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.507671 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:01.508042 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.508008 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:01.508195 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.508039 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-98wmj\"" Apr 17 11:17:01.508195 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.508043 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:01.508195 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.508149 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x2vcd\"" Apr 17 11:17:01.673079 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:01.673043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5m4mb" event={"ID":"64efb360-ec45-436d-87b0-6cd63e034c78","Type":"ContainerStarted","Data":"20f7dc3725fe12ded4d1584f305d460729495c42b494203fac54f351a35c7c1e"} Apr 17 11:17:02.678204 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:02.678165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7t99h" 
event={"ID":"3064d784-d81b-4df1-ad85-09f7ec7037db","Type":"ContainerStarted","Data":"98a41ad2aa65eb5f4f2f7bcde07bb5c7ef02bb881676a45cae20794bd00fe211"}
Apr 17 11:17:02.678204 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:02.678206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7t99h" event={"ID":"3064d784-d81b-4df1-ad85-09f7ec7037db","Type":"ContainerStarted","Data":"e667069d072b9c7079a0cc8b3aa2522838e860351b66e53417f25a372b3db5b9"}
Apr 17 11:17:02.678704 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:02.678263 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7t99h"
Apr 17 11:17:02.679736 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:02.679704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tjbc7" event={"ID":"586b9443-fc3c-482e-be48-5fd2e6e6cbe4","Type":"ContainerStarted","Data":"67054b018dda0a5564c94a716d7819a8124a38c76acd64ab0eaae9df1939edff"}
Apr 17 11:17:02.702897 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:02.702795 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7t99h" podStartSLOduration=1.9214534890000001 podStartE2EDuration="3.702778568s" podCreationTimestamp="2026-04-17 11:16:59 +0000 UTC" firstStartedPulling="2026-04-17 11:17:00.409382886 +0000 UTC m=+36.541159552" lastFinishedPulling="2026-04-17 11:17:02.190707963 +0000 UTC m=+38.322484631" observedRunningTime="2026-04-17 11:17:02.702271422 +0000 UTC m=+38.834048109" watchObservedRunningTime="2026-04-17 11:17:02.702778568 +0000 UTC m=+38.834555255"
Apr 17 11:17:02.728337 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:02.728290 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tjbc7" podStartSLOduration=1.7308773130000001 podStartE2EDuration="3.728271729s" podCreationTimestamp="2026-04-17 11:16:59 +0000 UTC" firstStartedPulling="2026-04-17 11:17:00.19645641 +0000 UTC m=+36.328233075" lastFinishedPulling="2026-04-17 11:17:02.193850822 +0000 UTC m=+38.325627491" observedRunningTime="2026-04-17 11:17:02.727894213 +0000 UTC m=+38.859670901" watchObservedRunningTime="2026-04-17 11:17:02.728271729 +0000 UTC m=+38.860048416"
Apr 17 11:17:03.029620 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.029538 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"]
Apr 17 11:17:03.033859 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.033830 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.037113 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.036792 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 11:17:03.037113 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.036832 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 11:17:03.037113 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.036851 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 11:17:03.037113 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.036887 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 11:17:03.037842 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.037821 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 11:17:03.037914 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.037873 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5xsqw\""
Apr 17 11:17:03.043114 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.043093 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"]
Apr 17 11:17:03.052412 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.052300 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-clrnz"]
Apr 17 11:17:03.055627 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.055603 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-dglc7"]
Apr 17 11:17:03.055774 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.055756 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.059627 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.058529 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sfwkf\""
Apr 17 11:17:03.059627 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.058904 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:17:03.059627 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.059001 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:17:03.059627 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.059325 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:17:03.061375 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.061344 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.063857 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.063834 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 11:17:03.064121 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.064061 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 11:17:03.064398 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.064376 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ptgzd\""
Apr 17 11:17:03.064398 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.064388 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 11:17:03.071293 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.071274 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-dglc7"]
Apr 17 11:17:03.151911 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.151880 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-root\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152089 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.151937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152089 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152016 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.152089 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152060 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aa6316d-30a4-4962-abd4-b78bcc544001-metrics-client-ca\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152243 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.152243 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.152243 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.152243 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxgcw\" (UniqueName: \"kubernetes.io/projected/6aa6316d-30a4-4962-abd4-b78bcc544001-kube-api-access-pxgcw\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152243 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-sys\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152438 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.152438 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152328 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwf9\" (UniqueName: \"kubernetes.io/projected/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-kube-api-access-zbwf9\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.152438 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.152438 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152386 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-accelerators-collector-config\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152611 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152465 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-textfile\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152611 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152500 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.152611 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-wtmp\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152611 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-tls\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.152611 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.152611 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.152597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjz9\" (UniqueName: \"kubernetes.io/projected/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-api-access-svjz9\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253002 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.252931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-textfile\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253002 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.252979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253161 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-wtmp\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253161 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-tls\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253161 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.253161 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svjz9\" (UniqueName: \"kubernetes.io/projected/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-api-access-svjz9\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253161 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:17:03.253109 2571 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 11:17:03.253161 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-root\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:17:03.253188 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-tls podName:4ab495b3-f50d-4e6d-b409-0e248fd3eb5d nodeName:}" failed. No retries permitted until 2026-04-17 11:17:03.753167808 +0000 UTC m=+39.884944494 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-dglc7" (UID: "4ab495b3-f50d-4e6d-b409-0e248fd3eb5d") : secret "kube-state-metrics-tls" not found
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aa6316d-30a4-4962-abd4-b78bcc544001-metrics-client-ca\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253308 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxgcw\" (UniqueName: \"kubernetes.io/projected/6aa6316d-30a4-4962-abd4-b78bcc544001-kube-api-access-pxgcw\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-sys\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.253934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwf9\" (UniqueName: \"kubernetes.io/projected/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-kube-api-access-zbwf9\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.253934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.253934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253539 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-wtmp\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-accelerators-collector-config\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.253934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.254183 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.254183 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-accelerators-collector-config\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.254183 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254155 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-root\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.254183 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aa6316d-30a4-4962-abd4-b78bcc544001-sys\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.254436 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.254486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.253281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-textfile\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.254763 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aa6316d-30a4-4962-abd4-b78bcc544001-metrics-client-ca\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.254860 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.254768 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.255673 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.255646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-tls\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.256567 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.256548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.257066 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.257042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aa6316d-30a4-4962-abd4-b78bcc544001-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.257163 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.257115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.258406 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.258386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.262741 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.262703 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjz9\" (UniqueName: \"kubernetes.io/projected/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-api-access-svjz9\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.262897 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.262878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwf9\" (UniqueName: \"kubernetes.io/projected/21ac4b07-6a87-445d-bdfe-0ea94f7f73eb-kube-api-access-zbwf9\") pod \"openshift-state-metrics-9d44df66c-qlcxv\" (UID: \"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.262943 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.262930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxgcw\" (UniqueName: \"kubernetes.io/projected/6aa6316d-30a4-4962-abd4-b78bcc544001-kube-api-access-pxgcw\") pod \"node-exporter-clrnz\" (UID: \"6aa6316d-30a4-4962-abd4-b78bcc544001\") " pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.345388 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.345312 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"
Apr 17 11:17:03.370275 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.370246 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-clrnz"
Apr 17 11:17:03.377672 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:03.377638 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa6316d_30a4_4962_abd4_b78bcc544001.slice/crio-2adc88ad8db9ef2d2d7d5e64ecb97fc14d2cf67b7a06e18ebc8cbaf66c9a06dc WatchSource:0}: Error finding container 2adc88ad8db9ef2d2d7d5e64ecb97fc14d2cf67b7a06e18ebc8cbaf66c9a06dc: Status 404 returned error can't find the container with id 2adc88ad8db9ef2d2d7d5e64ecb97fc14d2cf67b7a06e18ebc8cbaf66c9a06dc
Apr 17 11:17:03.458959 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.458928 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv"]
Apr 17 11:17:03.461924 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:03.461900 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ac4b07_6a87_445d_bdfe_0ea94f7f73eb.slice/crio-debc368249b399771daaf3119e935594043bbf2aa71850dfecc3746ba23b7c56 WatchSource:0}: Error finding container debc368249b399771daaf3119e935594043bbf2aa71850dfecc3746ba23b7c56: Status 404 returned error can't find the container with id debc368249b399771daaf3119e935594043bbf2aa71850dfecc3746ba23b7c56
Apr 17 11:17:03.683625 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.683595 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-clrnz" event={"ID":"6aa6316d-30a4-4962-abd4-b78bcc544001","Type":"ContainerStarted","Data":"2adc88ad8db9ef2d2d7d5e64ecb97fc14d2cf67b7a06e18ebc8cbaf66c9a06dc"}
Apr 17 11:17:03.685090 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.685066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv" event={"ID":"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb","Type":"ContainerStarted","Data":"da66b114da7adfccdf3d1c078b88aab060a9dd9b3022643d625fb172a463a758"}
Apr 17 11:17:03.685197 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.685097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv" event={"ID":"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb","Type":"ContainerStarted","Data":"26df76d7170bad046a412d7035ab5301e2670b641e5890ab67b9d73f57cdfb66"}
Apr 17 11:17:03.685197 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.685112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv" event={"ID":"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb","Type":"ContainerStarted","Data":"debc368249b399771daaf3119e935594043bbf2aa71850dfecc3746ba23b7c56"}
Apr 17 11:17:03.686686 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.686657 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5m4mb" event={"ID":"64efb360-ec45-436d-87b0-6cd63e034c78","Type":"ContainerStarted","Data":"f0fb451660aa424673f95263510cd01820a08fd5a6fce97b33fbea43e8a2e8bc"}
Apr 17 11:17:03.704939 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.704893 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5m4mb" podStartSLOduration=1.969381144 podStartE2EDuration="4.704878035s" podCreationTimestamp="2026-04-17 11:16:59 +0000 UTC" firstStartedPulling="2026-04-17 11:17:00.480243024 +0000 UTC m=+36.612019697" lastFinishedPulling="2026-04-17 11:17:03.215739924 +0000 UTC m=+39.347516588" observedRunningTime="2026-04-17 11:17:03.704372168 +0000 UTC m=+39.836148855" watchObservedRunningTime="2026-04-17 11:17:03.704878035 +0000 UTC m=+39.836654722"
Apr 17 11:17:03.757183 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.757151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.759867 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.759844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ab495b3-f50d-4e6d-b409-0e248fd3eb5d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dglc7\" (UID: \"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:03.977540 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:03.977443 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7"
Apr 17 11:17:04.091087 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.090559 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:17:04.116658 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.116630 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:17:04.116813 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.116778 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:17:04.119306 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119284 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 11:17:04.119451 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119388 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 11:17:04.119451 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119395 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 11:17:04.119614 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119594 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 11:17:04.119681 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119612 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 11:17:04.119681 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119633 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 11:17:04.119848 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119836 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 11:17:04.119925 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.119900 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 11:17:04.120513 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.120489 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 11:17:04.120631 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.120537 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2wzbp\"" Apr 17 11:17:04.262841 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.262807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.262997 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.262863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.262997 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.262897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36b27e94-81e6-4770-b057-9405a36d62a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.262997 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.262927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36b27e94-81e6-4770-b057-9405a36d62a7-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.262997 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.262964 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36b27e94-81e6-4770-b057-9405a36d62a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.262997 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.262988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-web-config\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4bnh\" (UniqueName: \"kubernetes.io/projected/36b27e94-81e6-4770-b057-9405a36d62a7-kube-api-access-d4bnh\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36b27e94-81e6-4770-b057-9405a36d62a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36b27e94-81e6-4770-b057-9405a36d62a7-config-out\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.263280 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.263184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
11:17:04.298694 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.298664 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-dglc7"] Apr 17 11:17:04.363811 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.363772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4bnh\" (UniqueName: \"kubernetes.io/projected/36b27e94-81e6-4770-b057-9405a36d62a7-kube-api-access-d4bnh\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.363966 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.363913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.363966 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.363948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.364090 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.363972 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36b27e94-81e6-4770-b057-9405a36d62a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.364090 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.363992 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36b27e94-81e6-4770-b057-9405a36d62a7-config-out\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.364090 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.364090 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.364309 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365033 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36b27e94-81e6-4770-b057-9405a36d62a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365033 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36b27e94-81e6-4770-b057-9405a36d62a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365033 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36b27e94-81e6-4770-b057-9405a36d62a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365033 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-web-config\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365033 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.364613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365683 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.365659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36b27e94-81e6-4770-b057-9405a36d62a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.365961 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.365938 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36b27e94-81e6-4770-b057-9405a36d62a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.366192 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.366172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36b27e94-81e6-4770-b057-9405a36d62a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.366858 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.366796 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36b27e94-81e6-4770-b057-9405a36d62a7-config-out\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.367162 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.366961 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.367396 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.367350 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-main-tls\") 
pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.367506 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.367470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.367583 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.367559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.367792 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.367772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36b27e94-81e6-4770-b057-9405a36d62a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.367792 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.367782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.368623 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.368600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-web-config\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.368851 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.368832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36b27e94-81e6-4770-b057-9405a36d62a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.372003 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:04.371968 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ab495b3_f50d_4e6d_b409_0e248fd3eb5d.slice/crio-43237788a84c3094093654352d93624b8b49b0504a2894ab3f36d1d276f3bcb4 WatchSource:0}: Error finding container 43237788a84c3094093654352d93624b8b49b0504a2894ab3f36d1d276f3bcb4: Status 404 returned error can't find the container with id 43237788a84c3094093654352d93624b8b49b0504a2894ab3f36d1d276f3bcb4 Apr 17 11:17:04.373043 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.373024 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4bnh\" (UniqueName: \"kubernetes.io/projected/36b27e94-81e6-4770-b057-9405a36d62a7-kube-api-access-d4bnh\") pod \"alertmanager-main-0\" (UID: \"36b27e94-81e6-4770-b057-9405a36d62a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.427661 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.427623 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:17:04.691074 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.691037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7" event={"ID":"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d","Type":"ContainerStarted","Data":"43237788a84c3094093654352d93624b8b49b0504a2894ab3f36d1d276f3bcb4"} Apr 17 11:17:04.786500 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:04.786476 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:17:04.788560 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:04.788536 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b27e94_81e6_4770_b057_9405a36d62a7.slice/crio-51c087a2eeba1e11296b41e86af815edae01ac30726c67a3b4c0466c01961b07 WatchSource:0}: Error finding container 51c087a2eeba1e11296b41e86af815edae01ac30726c67a3b4c0466c01961b07: Status 404 returned error can't find the container with id 51c087a2eeba1e11296b41e86af815edae01ac30726c67a3b4c0466c01961b07 Apr 17 11:17:05.695410 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:05.695368 2571 generic.go:358] "Generic (PLEG): container finished" podID="6aa6316d-30a4-4962-abd4-b78bcc544001" containerID="89e553776173d484c4d23aac964d622ab085831253ddaaf40ccfa1fbf668a0f7" exitCode=0 Apr 17 11:17:05.695860 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:05.695462 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-clrnz" event={"ID":"6aa6316d-30a4-4962-abd4-b78bcc544001","Type":"ContainerDied","Data":"89e553776173d484c4d23aac964d622ab085831253ddaaf40ccfa1fbf668a0f7"} Apr 17 11:17:05.696770 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:05.696741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"51c087a2eeba1e11296b41e86af815edae01ac30726c67a3b4c0466c01961b07"} Apr 17 11:17:05.698775 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:05.698749 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv" event={"ID":"21ac4b07-6a87-445d-bdfe-0ea94f7f73eb","Type":"ContainerStarted","Data":"446ee881c3fe6cb9867795a040185c08ca52b3ed1790a21d015e73e39b5229ce"} Apr 17 11:17:05.734752 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:05.734706 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qlcxv" podStartSLOduration=1.388337059 podStartE2EDuration="2.734691607s" podCreationTimestamp="2026-04-17 11:17:03 +0000 UTC" firstStartedPulling="2026-04-17 11:17:03.577736859 +0000 UTC m=+39.709513524" lastFinishedPulling="2026-04-17 11:17:04.924091407 +0000 UTC m=+41.055868072" observedRunningTime="2026-04-17 11:17:05.73399883 +0000 UTC m=+41.865775518" watchObservedRunningTime="2026-04-17 11:17:05.734691607 +0000 UTC m=+41.866468293" Apr 17 11:17:06.702839 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:06.702811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-clrnz" event={"ID":"6aa6316d-30a4-4962-abd4-b78bcc544001","Type":"ContainerStarted","Data":"60251cae0ee9bd9df0b3122e8bb5be28dd2932d90cb165bafefcd080f96e6f34"} Apr 17 11:17:07.708105 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.708069 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7" event={"ID":"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d","Type":"ContainerStarted","Data":"05f2ef9c123ad95f2ff6b9a0f7ebddc45b5c1e4c6ff586440553bfec3d65fb7a"} Apr 17 11:17:07.708105 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.708108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7" event={"ID":"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d","Type":"ContainerStarted","Data":"8eb71ecc29ad2187f78a50c33e59c736a45aed0df691b0f1ad8dc356879bb0c4"} Apr 17 11:17:07.708637 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.708125 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7" event={"ID":"4ab495b3-f50d-4e6d-b409-0e248fd3eb5d","Type":"ContainerStarted","Data":"1826708863c2acac129aabfc0a16ff8ce3a74640fd2e5e5aa7ec3d0c04a66f89"} Apr 17 11:17:07.709883 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.709847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-clrnz" event={"ID":"6aa6316d-30a4-4962-abd4-b78bcc544001","Type":"ContainerStarted","Data":"23f69baefb1a1d7c0e96c925b83e1213ad9b6d78bfb84a9657389a76f9999e74"} Apr 17 11:17:07.711013 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.710993 2571 generic.go:358] "Generic (PLEG): container finished" podID="36b27e94-81e6-4770-b057-9405a36d62a7" containerID="805dcdf9a3bf2115e4d3ad0277fe46af8efc86cc15361a864df00b80b6f9c9c6" exitCode=0 Apr 17 11:17:07.711110 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.711035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerDied","Data":"805dcdf9a3bf2115e4d3ad0277fe46af8efc86cc15361a864df00b80b6f9c9c6"} Apr 17 11:17:07.727829 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.727793 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-dglc7" podStartSLOduration=2.54139378 podStartE2EDuration="4.727781924s" podCreationTimestamp="2026-04-17 11:17:03 +0000 UTC" firstStartedPulling="2026-04-17 11:17:04.392893205 +0000 UTC m=+40.524669870" lastFinishedPulling="2026-04-17 11:17:06.579281345 +0000 UTC 
m=+42.711058014" observedRunningTime="2026-04-17 11:17:07.726528036 +0000 UTC m=+43.858304725" watchObservedRunningTime="2026-04-17 11:17:07.727781924 +0000 UTC m=+43.859558612" Apr 17 11:17:07.792258 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.792185 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-clrnz" podStartSLOduration=3.775642906 podStartE2EDuration="4.792170045s" podCreationTimestamp="2026-04-17 11:17:03 +0000 UTC" firstStartedPulling="2026-04-17 11:17:03.379164989 +0000 UTC m=+39.510941655" lastFinishedPulling="2026-04-17 11:17:04.395692126 +0000 UTC m=+40.527468794" observedRunningTime="2026-04-17 11:17:07.791673298 +0000 UTC m=+43.923449986" watchObservedRunningTime="2026-04-17 11:17:07.792170045 +0000 UTC m=+43.923946798" Apr 17 11:17:07.823326 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.823297 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"] Apr 17 11:17:07.842495 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.842424 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"] Apr 17 11:17:07.842643 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.842533 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:07.844872 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.844841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lsp57\""
Apr 17 11:17:07.844991 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.844894 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 11:17:07.995248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:07.995203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cd93e96-9968-4a9d-99fc-66831dcd7f1f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mprdx\" (UID: \"5cd93e96-9968-4a9d-99fc-66831dcd7f1f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:08.062403 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.062371 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:17:08.062573 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.062499 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 11:17:08.062958 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.062937 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wk5dr"
Apr 17 11:17:08.096418 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.096356 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cd93e96-9968-4a9d-99fc-66831dcd7f1f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mprdx\" (UID: \"5cd93e96-9968-4a9d-99fc-66831dcd7f1f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:08.098630 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.098611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cd93e96-9968-4a9d-99fc-66831dcd7f1f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mprdx\" (UID: \"5cd93e96-9968-4a9d-99fc-66831dcd7f1f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:08.151322 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.151297 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:08.271663 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.271634 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"]
Apr 17 11:17:08.275613 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:08.275583 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd93e96_9968_4a9d_99fc_66831dcd7f1f.slice/crio-c8ac3beaa889446b807954608043ebd3b72688e8a4b54bd8f3c4afa428f7b2f8 WatchSource:0}: Error finding container c8ac3beaa889446b807954608043ebd3b72688e8a4b54bd8f3c4afa428f7b2f8: Status 404 returned error can't find the container with id c8ac3beaa889446b807954608043ebd3b72688e8a4b54bd8f3c4afa428f7b2f8
Apr 17 11:17:08.718652 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:08.718614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx" event={"ID":"5cd93e96-9968-4a9d-99fc-66831dcd7f1f","Type":"ContainerStarted","Data":"c8ac3beaa889446b807954608043ebd3b72688e8a4b54bd8f3c4afa428f7b2f8"}
Apr 17 11:17:09.314682 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.314649 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:17:09.348008 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.347969 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:17:09.348177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.348162 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.350829 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.350806 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 11:17:09.351044 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.350950 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 11:17:09.351044 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.350952 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 11:17:09.351044 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.350983 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-oqoqg5q0kurd\""
Apr 17 11:17:09.351044 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351033 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 11:17:09.351044 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351044 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 11:17:09.351922 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351835 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hlm5z\""
Apr 17 11:17:09.351922 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351904 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 11:17:09.351922 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351919 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 11:17:09.352130 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351931 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 11:17:09.352130 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351839 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 11:17:09.352130 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.351936 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 11:17:09.352294 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.352281 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 11:17:09.357463 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.357442 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 11:17:09.364956 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.364937 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 11:17:09.509032 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.508997 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-web-config\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509204 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509204 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90e4e187-064b-4359-8536-37e5cfdf4231-config-out\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509204 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509204 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509200 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509331 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509355 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509713 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509453 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90e4e187-064b-4359-8536-37e5cfdf4231-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509713 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509713 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509713 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509554 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509713 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q6l6\" (UniqueName: \"kubernetes.io/projected/90e4e187-064b-4359-8536-37e5cfdf4231-kube-api-access-8q6l6\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.509713 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.509634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-config\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610397 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610397 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610617 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610617 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610724 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90e4e187-064b-4359-8536-37e5cfdf4231-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610724 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610724 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610724 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q6l6\" (UniqueName: \"kubernetes.io/projected/90e4e187-064b-4359-8536-37e5cfdf4231-kube-api-access-8q6l6\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-config\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610800 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-web-config\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90e4e187-064b-4359-8536-37e5cfdf4231-config-out\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.610938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.611248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.611248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.610981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.611248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.611006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.611248 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.611033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.611466 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.611375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.613799 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.613775 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614037 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614128 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614998 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614466 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614998 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90e4e187-064b-4359-8536-37e5cfdf4231-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614998 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90e4e187-064b-4359-8536-37e5cfdf4231-config-out\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614998 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.614998 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.614891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90e4e187-064b-4359-8536-37e5cfdf4231-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.615350 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.615254 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.615708 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.615684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.616688 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.616648 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.617435 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.617397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.617652 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.617635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-web-config\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.617922 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.617887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.618017 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.617996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-config\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.618069 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.618030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90e4e187-064b-4359-8536-37e5cfdf4231-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.622878 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.622860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q6l6\" (UniqueName: \"kubernetes.io/projected/90e4e187-064b-4359-8536-37e5cfdf4231-kube-api-access-8q6l6\") pod \"prometheus-k8s-0\" (UID: \"90e4e187-064b-4359-8536-37e5cfdf4231\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:09.657991 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:09.657963 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:10.187169 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.187039 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:17:10.189972 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:10.189916 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e4e187_064b_4359_8536_37e5cfdf4231.slice/crio-cdced735971d43dfe281025a60d037f8404f1a2307162f67dcfcb4a6db02a875 WatchSource:0}: Error finding container cdced735971d43dfe281025a60d037f8404f1a2307162f67dcfcb4a6db02a875: Status 404 returned error can't find the container with id cdced735971d43dfe281025a60d037f8404f1a2307162f67dcfcb4a6db02a875
Apr 17 11:17:10.724892 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.724810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx" event={"ID":"5cd93e96-9968-4a9d-99fc-66831dcd7f1f","Type":"ContainerStarted","Data":"e71701c22b144f552e59b1a61d7e159578060c9a1f232e9a59b7b27244c312e6"}
Apr 17 11:17:10.725231 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.725191 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:10.726435 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.726407 2571 generic.go:358] "Generic (PLEG): container finished" podID="90e4e187-064b-4359-8536-37e5cfdf4231" containerID="e6d0a82235b8cae75af8246db64b3767626521d6ef15a710e145c6a3646ba1ae" exitCode=0
Apr 17 11:17:10.726556 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.726491 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerDied","Data":"e6d0a82235b8cae75af8246db64b3767626521d6ef15a710e145c6a3646ba1ae"}
Apr 17 11:17:10.726556 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.726517 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"cdced735971d43dfe281025a60d037f8404f1a2307162f67dcfcb4a6db02a875"}
Apr 17 11:17:10.729703 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.729682 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"b9ab835a4dd41bb2ead23189a3151c41f1c93d30effd0fe75a22b4c73a392023"}
Apr 17 11:17:10.729781 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.729712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"fe52d9076729b7d1c467f0cb8256b503680f283adaceb1079959f5b77f60e3d9"}
Apr 17 11:17:10.729781 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.729726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"99dc0652c78807c90be592baa63dca6d51cad9621d1284c2d5b3f503573efc3f"}
Apr 17 11:17:10.729781 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.729741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"fca10735754a7b2ed5f703d7304be3bc1c0502c8bf0a9b61d5b8ec095f7e6cdc"}
Apr 17 11:17:10.729781 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.729753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"cb57496672637285982b95329918770c81934a07fee6daa3019ccab088a78bee"}
Apr 17 11:17:10.730584 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.730565 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx"
Apr 17 11:17:10.772835 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:10.772787 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mprdx" podStartSLOduration=1.99256871 podStartE2EDuration="3.77277473s" podCreationTimestamp="2026-04-17 11:17:07 +0000 UTC" firstStartedPulling="2026-04-17 11:17:08.278118109 +0000 UTC m=+44.409894774" lastFinishedPulling="2026-04-17 11:17:10.058324129 +0000 UTC m=+46.190100794" observedRunningTime="2026-04-17 11:17:10.741481046 +0000 UTC m=+46.873257733" watchObservedRunningTime="2026-04-17 11:17:10.77277473 +0000 UTC m=+46.904551417"
Apr 17 11:17:12.689364 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:12.689330 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7t99h"
Apr 17 11:17:12.740087 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:12.739977 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36b27e94-81e6-4770-b057-9405a36d62a7","Type":"ContainerStarted","Data":"06cefadae8ce05ad54009925aff04ad1295995066d755760bb32b652fce6b359"}
Apr 17 11:17:12.770492 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:12.770440 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.626537974 podStartE2EDuration="8.770423422s" podCreationTimestamp="2026-04-17 11:17:04 +0000 UTC" firstStartedPulling="2026-04-17 11:17:04.790473644 +0000 UTC m=+40.922250309" lastFinishedPulling="2026-04-17 11:17:11.934359092 +0000 UTC m=+48.066135757" observedRunningTime="2026-04-17 11:17:12.768082468 +0000 UTC m=+48.899859191" watchObservedRunningTime="2026-04-17 11:17:12.770423422 +0000 UTC m=+48.902200110"
Apr 17 11:17:14.747940 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:14.747900 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"bf89d3b6d6d570b6a81f12bbf44ef0110ac29116dab2d621439353d26403b243"}
Apr 17 11:17:14.747940 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:14.747938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"519846d069122b0444c4746af33c627edaa6b8bf2f47884c1b9c6190e9a6ade8"}
Apr 17 11:17:16.623941 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:16.623672 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nj28v"
Apr 17 11:17:16.757242 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:16.757192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"c43e300735bff25af151e9acf8a472d3756ac8f4379716c19f0a71483c46bf37"}
Apr 17 11:17:16.757242 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:16.757240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"03d26401d4c11da9483112cb77bb5d5fea053a576aa2f9e7cf8c0f837a64b9c7"}
Apr 17 11:17:16.757242 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:16.757250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"656ae9aa658776c9f304d49081c61b4d88e9947f72efc6104c0538f2ac000960"}
Apr 17 11:17:16.757529 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:16.757261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90e4e187-064b-4359-8536-37e5cfdf4231","Type":"ContainerStarted","Data":"5a46eacfdcdf844d28cfe1cbc93fe632b7aa7b8cda9cbdecfec3dc3975206242"}
Apr 17 11:17:16.793821 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:16.792916 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.200519501 podStartE2EDuration="7.792896787s" podCreationTimestamp="2026-04-17 11:17:09 +0000 UTC" firstStartedPulling="2026-04-17 11:17:10.727802633 +0000 UTC m=+46.859579303" lastFinishedPulling="2026-04-17 11:17:16.320179921 +0000 UTC m=+52.451956589" observedRunningTime="2026-04-17 11:17:16.790188624 +0000 UTC m=+52.921965308" watchObservedRunningTime="2026-04-17 11:17:16.792896787 +0000 UTC m=+52.924673480"
Apr 17 11:17:19.658786 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:19.658751 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:17:29.180450 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.180411 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:17:29.183417 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.183400 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:17:29.192853 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.192831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1c44fb-0716-4e07-9409-72264a348f29-metrics-certs\") pod \"network-metrics-daemon-fcgzf\" (UID: \"1b1c44fb-0716-4e07-9409-72264a348f29\") " pod="openshift-multus/network-metrics-daemon-fcgzf"
Apr 17 11:17:29.281448 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.281407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") " pod="openshift-network-diagnostics/network-check-target-82xvr"
Apr 17 11:17:29.284775 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.284757 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 11:17:29.294443 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.294428 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 11:17:29.304809 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.304781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmbfl\" (UniqueName: \"kubernetes.io/projected/97b2b03f-63f9-420a-8e36-4e191f507077-kube-api-access-dmbfl\") pod \"network-check-target-82xvr\" (UID: \"97b2b03f-63f9-420a-8e36-4e191f507077\") "
pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:17:29.417547 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.417519 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-98wmj\"" Apr 17 11:17:29.422140 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.422120 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x2vcd\"" Apr 17 11:17:29.425208 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.425194 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcgzf" Apr 17 11:17:29.430945 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.430885 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:17:29.550292 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.550263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fcgzf"] Apr 17 11:17:29.553741 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:29.553714 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1c44fb_0716_4e07_9409_72264a348f29.slice/crio-3860890a3098e242ac5c31433f2d5010bbd9dbae6debe3e2f0e34a4d61a0e3a8 WatchSource:0}: Error finding container 3860890a3098e242ac5c31433f2d5010bbd9dbae6debe3e2f0e34a4d61a0e3a8: Status 404 returned error can't find the container with id 3860890a3098e242ac5c31433f2d5010bbd9dbae6debe3e2f0e34a4d61a0e3a8 Apr 17 11:17:29.571931 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.571905 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-82xvr"] Apr 17 11:17:29.574703 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:17:29.574680 2571 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b2b03f_63f9_420a_8e36_4e191f507077.slice/crio-21a36edd88ce7a6b0ccf0fde93c22a0908a9a1ecdf731ed1beece235e5e7e853 WatchSource:0}: Error finding container 21a36edd88ce7a6b0ccf0fde93c22a0908a9a1ecdf731ed1beece235e5e7e853: Status 404 returned error can't find the container with id 21a36edd88ce7a6b0ccf0fde93c22a0908a9a1ecdf731ed1beece235e5e7e853 Apr 17 11:17:29.796775 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.796678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-82xvr" event={"ID":"97b2b03f-63f9-420a-8e36-4e191f507077","Type":"ContainerStarted","Data":"21a36edd88ce7a6b0ccf0fde93c22a0908a9a1ecdf731ed1beece235e5e7e853"} Apr 17 11:17:29.797756 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:29.797737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fcgzf" event={"ID":"1b1c44fb-0716-4e07-9409-72264a348f29","Type":"ContainerStarted","Data":"3860890a3098e242ac5c31433f2d5010bbd9dbae6debe3e2f0e34a4d61a0e3a8"} Apr 17 11:17:31.806188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:31.806144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fcgzf" event={"ID":"1b1c44fb-0716-4e07-9409-72264a348f29","Type":"ContainerStarted","Data":"acd99d561373da377c5b0fdb4cae501c243e0836ebc9bdcf1f052b8c14686f1c"} Apr 17 11:17:31.806188 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:31.806188 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fcgzf" event={"ID":"1b1c44fb-0716-4e07-9409-72264a348f29","Type":"ContainerStarted","Data":"847e1671b9d7fda4b61764a8c1b1f7f17ce318ffc96b9bf0747d653a424d9fd7"} Apr 17 11:17:33.813010 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:33.812979 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-82xvr" event={"ID":"97b2b03f-63f9-420a-8e36-4e191f507077","Type":"ContainerStarted","Data":"b70ba62ca016db9eb90da03c0b7d169659d93e9a165ad6a9607980363d0ed5c5"} Apr 17 11:17:33.813419 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:33.813084 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:17:33.829175 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:33.829131 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fcgzf" podStartSLOduration=68.573700991 podStartE2EDuration="1m9.829119524s" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC" firstStartedPulling="2026-04-17 11:17:29.555512821 +0000 UTC m=+65.687289501" lastFinishedPulling="2026-04-17 11:17:30.810931369 +0000 UTC m=+66.942708034" observedRunningTime="2026-04-17 11:17:31.822515845 +0000 UTC m=+67.954292542" watchObservedRunningTime="2026-04-17 11:17:33.829119524 +0000 UTC m=+69.960896210" Apr 17 11:17:33.829311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:33.829228 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-82xvr" podStartSLOduration=66.683267059 podStartE2EDuration="1m9.829207855s" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC" firstStartedPulling="2026-04-17 11:17:29.576630173 +0000 UTC m=+65.708406838" lastFinishedPulling="2026-04-17 11:17:32.722570969 +0000 UTC m=+68.854347634" observedRunningTime="2026-04-17 11:17:33.828149807 +0000 UTC m=+69.959926506" watchObservedRunningTime="2026-04-17 11:17:33.829207855 +0000 UTC m=+69.960984542" Apr 17 11:17:50.379810 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:50.379774 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:17:50.407588 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:17:50.407560 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:17:50.881434 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:17:50.881395 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:18:04.818911 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:04.818880 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-82xvr" Apr 17 11:18:25.578105 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.578066 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-54fdx"] Apr 17 11:18:25.583117 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.583099 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.586243 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.586207 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:18:25.592361 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.592340 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-54fdx"] Apr 17 11:18:25.622304 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.622268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9c775e3-9dbe-4167-ae13-e921d68e51d8-original-pull-secret\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.622304 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.622309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" 
(UniqueName: \"kubernetes.io/host-path/d9c775e3-9dbe-4167-ae13-e921d68e51d8-dbus\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.622545 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.622329 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d9c775e3-9dbe-4167-ae13-e921d68e51d8-kubelet-config\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.723566 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.723526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9c775e3-9dbe-4167-ae13-e921d68e51d8-original-pull-secret\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.723708 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.723573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d9c775e3-9dbe-4167-ae13-e921d68e51d8-dbus\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.723708 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.723605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d9c775e3-9dbe-4167-ae13-e921d68e51d8-kubelet-config\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.723773 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.723732 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d9c775e3-9dbe-4167-ae13-e921d68e51d8-kubelet-config\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.723811 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.723785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d9c775e3-9dbe-4167-ae13-e921d68e51d8-dbus\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.725875 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.725848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9c775e3-9dbe-4167-ae13-e921d68e51d8-original-pull-secret\") pod \"global-pull-secret-syncer-54fdx\" (UID: \"d9c775e3-9dbe-4167-ae13-e921d68e51d8\") " pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:25.891860 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:25.891785 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-54fdx" Apr 17 11:18:26.008407 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:26.008373 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-54fdx"] Apr 17 11:18:26.012006 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:18:26.011978 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c775e3_9dbe_4167_ae13_e921d68e51d8.slice/crio-c5ffa6a52a748bfc611de387c570588a0f680c96545be826bf4989ef4f601472 WatchSource:0}: Error finding container c5ffa6a52a748bfc611de387c570588a0f680c96545be826bf4989ef4f601472: Status 404 returned error can't find the container with id c5ffa6a52a748bfc611de387c570588a0f680c96545be826bf4989ef4f601472 Apr 17 11:18:26.959761 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:26.959708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-54fdx" event={"ID":"d9c775e3-9dbe-4167-ae13-e921d68e51d8","Type":"ContainerStarted","Data":"c5ffa6a52a748bfc611de387c570588a0f680c96545be826bf4989ef4f601472"} Apr 17 11:18:30.973356 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:30.973321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-54fdx" event={"ID":"d9c775e3-9dbe-4167-ae13-e921d68e51d8","Type":"ContainerStarted","Data":"701fcbac03882d73cbb85b3d1f348b56b68f627dd6ec678ee330a09795b2de6a"} Apr 17 11:18:30.990984 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:18:30.990937 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-54fdx" podStartSLOduration=1.912394532 podStartE2EDuration="5.990922392s" podCreationTimestamp="2026-04-17 11:18:25 +0000 UTC" firstStartedPulling="2026-04-17 11:18:26.013734192 +0000 UTC m=+122.145510857" lastFinishedPulling="2026-04-17 11:18:30.092262052 +0000 UTC m=+126.224038717" 
observedRunningTime="2026-04-17 11:18:30.989461828 +0000 UTC m=+127.121238516" watchObservedRunningTime="2026-04-17 11:18:30.990922392 +0000 UTC m=+127.122699082" Apr 17 11:21:24.389227 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:24.389187 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/1.log" Apr 17 11:21:24.389703 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:24.389619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal_c05921670fca4842dd48b5deb56ad8b1/kube-rbac-proxy-crio/1.log" Apr 17 11:21:24.391488 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:24.391472 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:21:28.484965 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.484929 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-76srm"] Apr 17 11:21:28.487991 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.487975 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.490597 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.490575 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 11:21:28.490699 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.490575 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 11:21:28.491578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.491560 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mdxbb\"" Apr 17 11:21:28.491578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.491575 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 11:21:28.497139 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.497115 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-76srm"] Apr 17 11:21:28.518470 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.518448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1a135322-5f7f-4fea-a693-49ac4d55f176-data\") pod \"seaweedfs-86cc847c5c-76srm\" (UID: \"1a135322-5f7f-4fea-a693-49ac4d55f176\") " pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.518593 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.518483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x6d6\" (UniqueName: \"kubernetes.io/projected/1a135322-5f7f-4fea-a693-49ac4d55f176-kube-api-access-2x6d6\") pod \"seaweedfs-86cc847c5c-76srm\" (UID: \"1a135322-5f7f-4fea-a693-49ac4d55f176\") " pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.619048 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.619018 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1a135322-5f7f-4fea-a693-49ac4d55f176-data\") pod \"seaweedfs-86cc847c5c-76srm\" (UID: \"1a135322-5f7f-4fea-a693-49ac4d55f176\") " pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.619197 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.619060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x6d6\" (UniqueName: \"kubernetes.io/projected/1a135322-5f7f-4fea-a693-49ac4d55f176-kube-api-access-2x6d6\") pod \"seaweedfs-86cc847c5c-76srm\" (UID: \"1a135322-5f7f-4fea-a693-49ac4d55f176\") " pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.619435 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.619418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1a135322-5f7f-4fea-a693-49ac4d55f176-data\") pod \"seaweedfs-86cc847c5c-76srm\" (UID: \"1a135322-5f7f-4fea-a693-49ac4d55f176\") " pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.629190 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.629162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x6d6\" (UniqueName: \"kubernetes.io/projected/1a135322-5f7f-4fea-a693-49ac4d55f176-kube-api-access-2x6d6\") pod \"seaweedfs-86cc847c5c-76srm\" (UID: \"1a135322-5f7f-4fea-a693-49ac4d55f176\") " pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.798307 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.798206 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:28.916983 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.916949 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-76srm"] Apr 17 11:21:28.920615 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:21:28.920587 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a135322_5f7f_4fea_a693_49ac4d55f176.slice/crio-e4d16714c1081426f67ea92427dd8f840426e38f94573e663d8849b248307007 WatchSource:0}: Error finding container e4d16714c1081426f67ea92427dd8f840426e38f94573e663d8849b248307007: Status 404 returned error can't find the container with id e4d16714c1081426f67ea92427dd8f840426e38f94573e663d8849b248307007 Apr 17 11:21:28.921835 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:28.921816 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:21:29.450010 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:29.449970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-76srm" event={"ID":"1a135322-5f7f-4fea-a693-49ac4d55f176","Type":"ContainerStarted","Data":"e4d16714c1081426f67ea92427dd8f840426e38f94573e663d8849b248307007"} Apr 17 11:21:33.462918 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:33.462879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-76srm" event={"ID":"1a135322-5f7f-4fea-a693-49ac4d55f176","Type":"ContainerStarted","Data":"4d39660a716ee8f8ca7f0c6f0d03f8fd58770efc7e173606468a9eea22ad4a4f"} Apr 17 11:21:33.463329 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:33.463005 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:21:33.479850 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:33.479805 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/seaweedfs-86cc847c5c-76srm" podStartSLOduration=1.827279675 podStartE2EDuration="5.479790937s" podCreationTimestamp="2026-04-17 11:21:28 +0000 UTC" firstStartedPulling="2026-04-17 11:21:28.921965122 +0000 UTC m=+305.053741786" lastFinishedPulling="2026-04-17 11:21:32.57447637 +0000 UTC m=+308.706253048" observedRunningTime="2026-04-17 11:21:33.478707834 +0000 UTC m=+309.610484521" watchObservedRunningTime="2026-04-17 11:21:33.479790937 +0000 UTC m=+309.611567624" Apr 17 11:21:39.468289 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:21:39.468261 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-76srm" Apr 17 11:22:37.874170 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.874126 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-rm476"] Apr 17 11:22:37.877445 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.877425 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:37.879833 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.879815 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 11:22:37.879956 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.879940 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-mssnn\"" Apr 17 11:22:37.886382 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.886364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-rm476"] Apr 17 11:22:37.993015 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.992976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ba10494-e1c2-44de-854e-2011424b1a9b-cert\") pod 
\"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:37.993177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:37.993023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t95l\" (UniqueName: \"kubernetes.io/projected/4ba10494-e1c2-44de-854e-2011424b1a9b-kube-api-access-2t95l\") pod \"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.093660 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.093627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ba10494-e1c2-44de-854e-2011424b1a9b-cert\") pod \"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.093840 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.093669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t95l\" (UniqueName: \"kubernetes.io/projected/4ba10494-e1c2-44de-854e-2011424b1a9b-kube-api-access-2t95l\") pod \"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.093840 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:22:38.093776 2571 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 11:22:38.093928 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:22:38.093842 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba10494-e1c2-44de-854e-2011424b1a9b-cert podName:4ba10494-e1c2-44de-854e-2011424b1a9b nodeName:}" failed. 
No retries permitted until 2026-04-17 11:22:38.593822034 +0000 UTC m=+374.725598710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4ba10494-e1c2-44de-854e-2011424b1a9b-cert") pod "odh-model-controller-696fc77849-rm476" (UID: "4ba10494-e1c2-44de-854e-2011424b1a9b") : secret "odh-model-controller-webhook-cert" not found Apr 17 11:22:38.104891 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.104866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t95l\" (UniqueName: \"kubernetes.io/projected/4ba10494-e1c2-44de-854e-2011424b1a9b-kube-api-access-2t95l\") pod \"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.598346 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.598311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ba10494-e1c2-44de-854e-2011424b1a9b-cert\") pod \"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.600701 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.600679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ba10494-e1c2-44de-854e-2011424b1a9b-cert\") pod \"odh-model-controller-696fc77849-rm476\" (UID: \"4ba10494-e1c2-44de-854e-2011424b1a9b\") " pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.788395 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.788356 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:38.903037 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:38.903012 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-rm476"] Apr 17 11:22:38.906223 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:22:38.906187 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba10494_e1c2_44de_854e_2011424b1a9b.slice/crio-b54ba66cb836cc434f821a4a897424997e57dde8bde26287995df1aa42481bf6 WatchSource:0}: Error finding container b54ba66cb836cc434f821a4a897424997e57dde8bde26287995df1aa42481bf6: Status 404 returned error can't find the container with id b54ba66cb836cc434f821a4a897424997e57dde8bde26287995df1aa42481bf6 Apr 17 11:22:39.638718 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:39.638681 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-rm476" event={"ID":"4ba10494-e1c2-44de-854e-2011424b1a9b","Type":"ContainerStarted","Data":"b54ba66cb836cc434f821a4a897424997e57dde8bde26287995df1aa42481bf6"} Apr 17 11:22:41.646520 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:41.646490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-rm476" event={"ID":"4ba10494-e1c2-44de-854e-2011424b1a9b","Type":"ContainerStarted","Data":"568e232d3fd98c1db2bdfe6d39d039e1184ce5b5b737ed6ca5228585db02d107"} Apr 17 11:22:41.646903 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:41.646615 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:41.664618 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:41.664550 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-rm476" podStartSLOduration=2.122130923 podStartE2EDuration="4.664497206s" 
podCreationTimestamp="2026-04-17 11:22:37 +0000 UTC" firstStartedPulling="2026-04-17 11:22:38.907311711 +0000 UTC m=+375.039088377" lastFinishedPulling="2026-04-17 11:22:41.449677996 +0000 UTC m=+377.581454660" observedRunningTime="2026-04-17 11:22:41.663805847 +0000 UTC m=+377.795582533" watchObservedRunningTime="2026-04-17 11:22:41.664497206 +0000 UTC m=+377.796273893" Apr 17 11:22:52.652071 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:52.652043 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-rm476" Apr 17 11:22:53.493193 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.493154 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-stvql"] Apr 17 11:22:53.499652 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.499625 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-stvql" Apr 17 11:22:53.502916 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.502892 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-stvql"] Apr 17 11:22:53.623759 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.623727 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxwq\" (UniqueName: \"kubernetes.io/projected/1b608843-b9ab-4137-af3d-1380091da4d0-kube-api-access-qkxwq\") pod \"s3-init-stvql\" (UID: \"1b608843-b9ab-4137-af3d-1380091da4d0\") " pod="kserve/s3-init-stvql" Apr 17 11:22:53.724539 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.724510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxwq\" (UniqueName: \"kubernetes.io/projected/1b608843-b9ab-4137-af3d-1380091da4d0-kube-api-access-qkxwq\") pod \"s3-init-stvql\" (UID: \"1b608843-b9ab-4137-af3d-1380091da4d0\") " pod="kserve/s3-init-stvql" Apr 17 11:22:53.734674 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.734649 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxwq\" (UniqueName: \"kubernetes.io/projected/1b608843-b9ab-4137-af3d-1380091da4d0-kube-api-access-qkxwq\") pod \"s3-init-stvql\" (UID: \"1b608843-b9ab-4137-af3d-1380091da4d0\") " pod="kserve/s3-init-stvql" Apr 17 11:22:53.829916 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.829833 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-stvql" Apr 17 11:22:53.948470 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:53.948446 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-stvql"] Apr 17 11:22:53.950806 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:22:53.950773 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b608843_b9ab_4137_af3d_1380091da4d0.slice/crio-457664ee78dc4a9c50e6de35c6f51e974b131ac9ddcc2501a992fb4c681d522f WatchSource:0}: Error finding container 457664ee78dc4a9c50e6de35c6f51e974b131ac9ddcc2501a992fb4c681d522f: Status 404 returned error can't find the container with id 457664ee78dc4a9c50e6de35c6f51e974b131ac9ddcc2501a992fb4c681d522f Apr 17 11:22:54.686180 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:54.686144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-stvql" event={"ID":"1b608843-b9ab-4137-af3d-1380091da4d0","Type":"ContainerStarted","Data":"457664ee78dc4a9c50e6de35c6f51e974b131ac9ddcc2501a992fb4c681d522f"} Apr 17 11:22:58.700822 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:58.700783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-stvql" event={"ID":"1b608843-b9ab-4137-af3d-1380091da4d0","Type":"ContainerStarted","Data":"17206b5fe8d4546e9bcec6462414115409af3273f9af14c8466b554c23ac4bcf"} Apr 17 11:22:58.719793 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:22:58.719739 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/s3-init-stvql" podStartSLOduration=1.243817317 podStartE2EDuration="5.719719291s" podCreationTimestamp="2026-04-17 11:22:53 +0000 UTC" firstStartedPulling="2026-04-17 11:22:53.952471087 +0000 UTC m=+390.084247752" lastFinishedPulling="2026-04-17 11:22:58.428373058 +0000 UTC m=+394.560149726" observedRunningTime="2026-04-17 11:22:58.717206492 +0000 UTC m=+394.848983212" watchObservedRunningTime="2026-04-17 11:22:58.719719291 +0000 UTC m=+394.851495983" Apr 17 11:23:01.710172 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:01.710086 2571 generic.go:358] "Generic (PLEG): container finished" podID="1b608843-b9ab-4137-af3d-1380091da4d0" containerID="17206b5fe8d4546e9bcec6462414115409af3273f9af14c8466b554c23ac4bcf" exitCode=0 Apr 17 11:23:01.710172 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:01.710157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-stvql" event={"ID":"1b608843-b9ab-4137-af3d-1380091da4d0","Type":"ContainerDied","Data":"17206b5fe8d4546e9bcec6462414115409af3273f9af14c8466b554c23ac4bcf"} Apr 17 11:23:02.835476 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:02.835450 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-stvql" Apr 17 11:23:03.006505 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:03.006420 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkxwq\" (UniqueName: \"kubernetes.io/projected/1b608843-b9ab-4137-af3d-1380091da4d0-kube-api-access-qkxwq\") pod \"1b608843-b9ab-4137-af3d-1380091da4d0\" (UID: \"1b608843-b9ab-4137-af3d-1380091da4d0\") " Apr 17 11:23:03.009189 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:03.009162 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b608843-b9ab-4137-af3d-1380091da4d0-kube-api-access-qkxwq" (OuterVolumeSpecName: "kube-api-access-qkxwq") pod "1b608843-b9ab-4137-af3d-1380091da4d0" (UID: "1b608843-b9ab-4137-af3d-1380091da4d0"). InnerVolumeSpecName "kube-api-access-qkxwq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:23:03.107346 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:03.107271 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkxwq\" (UniqueName: \"kubernetes.io/projected/1b608843-b9ab-4137-af3d-1380091da4d0-kube-api-access-qkxwq\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 17 11:23:03.716547 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:03.716506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-stvql" event={"ID":"1b608843-b9ab-4137-af3d-1380091da4d0","Type":"ContainerDied","Data":"457664ee78dc4a9c50e6de35c6f51e974b131ac9ddcc2501a992fb4c681d522f"} Apr 17 11:23:03.716547 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:03.716525 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-stvql" Apr 17 11:23:03.716547 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:03.716539 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="457664ee78dc4a9c50e6de35c6f51e974b131ac9ddcc2501a992fb4c681d522f" Apr 17 11:23:04.417772 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.417740 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24"] Apr 17 11:23:04.418123 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.418060 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b608843-b9ab-4137-af3d-1380091da4d0" containerName="s3-init" Apr 17 11:23:04.418123 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.418073 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b608843-b9ab-4137-af3d-1380091da4d0" containerName="s3-init" Apr 17 11:23:04.418191 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.418129 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b608843-b9ab-4137-af3d-1380091da4d0" containerName="s3-init" Apr 17 11:23:04.419946 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.419930 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.423314 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.423295 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 17 11:23:04.429180 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.429156 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24"] Apr 17 11:23:04.519079 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.519053 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7m8\" (UniqueName: \"kubernetes.io/projected/f1d43dff-b34c-439b-8133-280c95ac6d71-kube-api-access-5x7m8\") pod \"seaweedfs-tls-custom-ddd4dbfd-zzb24\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.519260 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.519090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f1d43dff-b34c-439b-8133-280c95ac6d71-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-zzb24\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.619797 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.619761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7m8\" (UniqueName: \"kubernetes.io/projected/f1d43dff-b34c-439b-8133-280c95ac6d71-kube-api-access-5x7m8\") pod \"seaweedfs-tls-custom-ddd4dbfd-zzb24\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.619958 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.619826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/f1d43dff-b34c-439b-8133-280c95ac6d71-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-zzb24\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.620181 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.620162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f1d43dff-b34c-439b-8133-280c95ac6d71-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-zzb24\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.633944 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.633923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7m8\" (UniqueName: \"kubernetes.io/projected/f1d43dff-b34c-439b-8133-280c95ac6d71-kube-api-access-5x7m8\") pod \"seaweedfs-tls-custom-ddd4dbfd-zzb24\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.728590 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.728558 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:04.845897 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:04.845864 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24"] Apr 17 11:23:04.848712 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:04.848686 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d43dff_b34c_439b_8133_280c95ac6d71.slice/crio-45bcaa6bff6d6e6d6388f229759d48f2f25e58937bfb8541e8a5cf863f81db90 WatchSource:0}: Error finding container 45bcaa6bff6d6e6d6388f229759d48f2f25e58937bfb8541e8a5cf863f81db90: Status 404 returned error can't find the container with id 45bcaa6bff6d6e6d6388f229759d48f2f25e58937bfb8541e8a5cf863f81db90 Apr 17 11:23:05.722926 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:05.722892 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" event={"ID":"f1d43dff-b34c-439b-8133-280c95ac6d71","Type":"ContainerStarted","Data":"b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0"} Apr 17 11:23:05.722926 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:05.722927 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" event={"ID":"f1d43dff-b34c-439b-8133-280c95ac6d71","Type":"ContainerStarted","Data":"45bcaa6bff6d6e6d6388f229759d48f2f25e58937bfb8541e8a5cf863f81db90"} Apr 17 11:23:05.738675 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:05.738634 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" podStartSLOduration=1.5063689789999999 podStartE2EDuration="1.738621619s" podCreationTimestamp="2026-04-17 11:23:04 +0000 UTC" firstStartedPulling="2026-04-17 11:23:04.850449033 +0000 UTC m=+400.982225702" lastFinishedPulling="2026-04-17 11:23:05.082701675 +0000 UTC m=+401.214478342" 
observedRunningTime="2026-04-17 11:23:05.738000513 +0000 UTC m=+401.869777201" watchObservedRunningTime="2026-04-17 11:23:05.738621619 +0000 UTC m=+401.870398305" Apr 17 11:23:06.879671 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:06.879636 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24"] Apr 17 11:23:07.728954 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:07.728890 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" podUID="f1d43dff-b34c-439b-8133-280c95ac6d71" containerName="seaweedfs-tls-custom" containerID="cri-o://b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0" gracePeriod=30 Apr 17 11:23:08.963811 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:08.963787 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:09.052563 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.052474 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x7m8\" (UniqueName: \"kubernetes.io/projected/f1d43dff-b34c-439b-8133-280c95ac6d71-kube-api-access-5x7m8\") pod \"f1d43dff-b34c-439b-8133-280c95ac6d71\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " Apr 17 11:23:09.052563 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.052535 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f1d43dff-b34c-439b-8133-280c95ac6d71-data\") pod \"f1d43dff-b34c-439b-8133-280c95ac6d71\" (UID: \"f1d43dff-b34c-439b-8133-280c95ac6d71\") " Apr 17 11:23:09.053695 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.053668 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d43dff-b34c-439b-8133-280c95ac6d71-data" (OuterVolumeSpecName: "data") pod "f1d43dff-b34c-439b-8133-280c95ac6d71" (UID: 
"f1d43dff-b34c-439b-8133-280c95ac6d71"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:23:09.054562 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.054542 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d43dff-b34c-439b-8133-280c95ac6d71-kube-api-access-5x7m8" (OuterVolumeSpecName: "kube-api-access-5x7m8") pod "f1d43dff-b34c-439b-8133-280c95ac6d71" (UID: "f1d43dff-b34c-439b-8133-280c95ac6d71"). InnerVolumeSpecName "kube-api-access-5x7m8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:23:09.153626 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.153591 2571 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f1d43dff-b34c-439b-8133-280c95ac6d71-data\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 17 11:23:09.153626 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.153621 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5x7m8\" (UniqueName: \"kubernetes.io/projected/f1d43dff-b34c-439b-8133-280c95ac6d71-kube-api-access-5x7m8\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 17 11:23:09.735894 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.735861 2571 generic.go:358] "Generic (PLEG): container finished" podID="f1d43dff-b34c-439b-8133-280c95ac6d71" containerID="b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0" exitCode=0 Apr 17 11:23:09.736108 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.735918 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" Apr 17 11:23:09.736108 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.735949 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" event={"ID":"f1d43dff-b34c-439b-8133-280c95ac6d71","Type":"ContainerDied","Data":"b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0"} Apr 17 11:23:09.736108 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.735991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24" event={"ID":"f1d43dff-b34c-439b-8133-280c95ac6d71","Type":"ContainerDied","Data":"45bcaa6bff6d6e6d6388f229759d48f2f25e58937bfb8541e8a5cf863f81db90"} Apr 17 11:23:09.736108 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.736011 2571 scope.go:117] "RemoveContainer" containerID="b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0" Apr 17 11:23:09.745379 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.745362 2571 scope.go:117] "RemoveContainer" containerID="b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0" Apr 17 11:23:09.745645 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:23:09.745628 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0\": container with ID starting with b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0 not found: ID does not exist" containerID="b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0" Apr 17 11:23:09.745692 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.745652 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0"} err="failed to get container status \"b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0\": rpc error: code = 
NotFound desc = could not find container \"b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0\": container with ID starting with b9bb30380d003ada167c3dfbdaca969084c08dca41d905b014ef67fd56da6eb0 not found: ID does not exist" Apr 17 11:23:09.758397 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.758370 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24"] Apr 17 11:23:09.761792 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.761768 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zzb24"] Apr 17 11:23:09.792143 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.792116 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9"] Apr 17 11:23:09.792446 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.792428 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d43dff-b34c-439b-8133-280c95ac6d71" containerName="seaweedfs-tls-custom" Apr 17 11:23:09.792446 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.792446 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d43dff-b34c-439b-8133-280c95ac6d71" containerName="seaweedfs-tls-custom" Apr 17 11:23:09.792544 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.792513 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d43dff-b34c-439b-8133-280c95ac6d71" containerName="seaweedfs-tls-custom" Apr 17 11:23:09.795378 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.795361 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.798249 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.798208 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 17 11:23:09.798386 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.798369 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 17 11:23:09.805811 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.805789 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9"] Apr 17 11:23:09.857344 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.857314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.857486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.857354 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.857486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.857374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whpp\" (UniqueName: \"kubernetes.io/projected/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-kube-api-access-8whpp\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" 
Apr 17 11:23:09.957814 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.957778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.957941 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.957833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.957941 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.957855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8whpp\" (UniqueName: \"kubernetes.io/projected/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-kube-api-access-8whpp\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.958269 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.958248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.960194 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.960176 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" 
(UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:09.966770 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:09.966746 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whpp\" (UniqueName: \"kubernetes.io/projected/c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0-kube-api-access-8whpp\") pod \"seaweedfs-tls-custom-5c88b85bb7-gvbp9\" (UID: \"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:10.104356 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:10.104265 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" Apr 17 11:23:10.245205 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:10.245174 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9"] Apr 17 11:23:10.248046 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:10.248019 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0af8ff0_37fe_471a_9a37_e5e8cb2cfaf0.slice/crio-2970fb6ba79b8d1ecf8c3178bc3d958805106f37d0d9e36a969703315f4b2e3c WatchSource:0}: Error finding container 2970fb6ba79b8d1ecf8c3178bc3d958805106f37d0d9e36a969703315f4b2e3c: Status 404 returned error can't find the container with id 2970fb6ba79b8d1ecf8c3178bc3d958805106f37d0d9e36a969703315f4b2e3c Apr 17 11:23:10.510546 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:10.510520 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d43dff-b34c-439b-8133-280c95ac6d71" path="/var/lib/kubelet/pods/f1d43dff-b34c-439b-8133-280c95ac6d71/volumes" Apr 17 11:23:10.740151 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:10.740116 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" 
event={"ID":"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0","Type":"ContainerStarted","Data":"2d4ca25e4518c390c8609cf0ea401c0e6e594edcdf178b5cb2a5e178691a3efe"} Apr 17 11:23:10.740151 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:10.740148 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" event={"ID":"c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0","Type":"ContainerStarted","Data":"2970fb6ba79b8d1ecf8c3178bc3d958805106f37d0d9e36a969703315f4b2e3c"} Apr 17 11:23:10.765073 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:10.763484 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-gvbp9" podStartSLOduration=1.508347336 podStartE2EDuration="1.763466847s" podCreationTimestamp="2026-04-17 11:23:09 +0000 UTC" firstStartedPulling="2026-04-17 11:23:10.249297245 +0000 UTC m=+406.381073910" lastFinishedPulling="2026-04-17 11:23:10.504416753 +0000 UTC m=+406.636193421" observedRunningTime="2026-04-17 11:23:10.761338259 +0000 UTC m=+406.893114944" watchObservedRunningTime="2026-04-17 11:23:10.763466847 +0000 UTC m=+406.895243536" Apr 17 11:23:11.049772 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.049691 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-7d44j"] Apr 17 11:23:11.052011 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.051990 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7d44j" Apr 17 11:23:11.059387 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.059364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-7d44j"] Apr 17 11:23:11.072330 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.072293 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vfrt\" (UniqueName: \"kubernetes.io/projected/92b276db-ac4d-4f6b-a575-15495a488afb-kube-api-access-5vfrt\") pod \"s3-tls-init-custom-7d44j\" (UID: \"92b276db-ac4d-4f6b-a575-15495a488afb\") " pod="kserve/s3-tls-init-custom-7d44j" Apr 17 11:23:11.172860 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.172828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfrt\" (UniqueName: \"kubernetes.io/projected/92b276db-ac4d-4f6b-a575-15495a488afb-kube-api-access-5vfrt\") pod \"s3-tls-init-custom-7d44j\" (UID: \"92b276db-ac4d-4f6b-a575-15495a488afb\") " pod="kserve/s3-tls-init-custom-7d44j" Apr 17 11:23:11.182570 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.182541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfrt\" (UniqueName: \"kubernetes.io/projected/92b276db-ac4d-4f6b-a575-15495a488afb-kube-api-access-5vfrt\") pod \"s3-tls-init-custom-7d44j\" (UID: \"92b276db-ac4d-4f6b-a575-15495a488afb\") " pod="kserve/s3-tls-init-custom-7d44j" Apr 17 11:23:11.374085 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.373999 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7d44j" Apr 17 11:23:11.487883 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.487851 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-7d44j"] Apr 17 11:23:11.491432 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:11.491403 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92b276db_ac4d_4f6b_a575_15495a488afb.slice/crio-2015c3050c8a22f62d45b0c24fdaf79297be7aad0f65d8cb88da687d5e16ad80 WatchSource:0}: Error finding container 2015c3050c8a22f62d45b0c24fdaf79297be7aad0f65d8cb88da687d5e16ad80: Status 404 returned error can't find the container with id 2015c3050c8a22f62d45b0c24fdaf79297be7aad0f65d8cb88da687d5e16ad80 Apr 17 11:23:11.744000 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.743969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7d44j" event={"ID":"92b276db-ac4d-4f6b-a575-15495a488afb","Type":"ContainerStarted","Data":"222acbaaec2688b31b375b6fd98c5cb7946038ac4253c298450ddac79aa2036c"} Apr 17 11:23:11.744000 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.744004 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7d44j" event={"ID":"92b276db-ac4d-4f6b-a575-15495a488afb","Type":"ContainerStarted","Data":"2015c3050c8a22f62d45b0c24fdaf79297be7aad0f65d8cb88da687d5e16ad80"} Apr 17 11:23:11.761023 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:11.760978 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-7d44j" podStartSLOduration=0.760964847 podStartE2EDuration="760.964847ms" podCreationTimestamp="2026-04-17 11:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:23:11.759799007 +0000 UTC m=+407.891575694" watchObservedRunningTime="2026-04-17 
11:23:11.760964847 +0000 UTC m=+407.892741513"
Apr 17 11:23:16.758486 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:16.758455 2571 generic.go:358] "Generic (PLEG): container finished" podID="92b276db-ac4d-4f6b-a575-15495a488afb" containerID="222acbaaec2688b31b375b6fd98c5cb7946038ac4253c298450ddac79aa2036c" exitCode=0
Apr 17 11:23:16.758876 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:16.758504 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7d44j" event={"ID":"92b276db-ac4d-4f6b-a575-15495a488afb","Type":"ContainerDied","Data":"222acbaaec2688b31b375b6fd98c5cb7946038ac4253c298450ddac79aa2036c"}
Apr 17 11:23:17.882592 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:17.882564 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-7d44j"
Apr 17 11:23:17.930517 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:17.930482 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vfrt\" (UniqueName: \"kubernetes.io/projected/92b276db-ac4d-4f6b-a575-15495a488afb-kube-api-access-5vfrt\") pod \"92b276db-ac4d-4f6b-a575-15495a488afb\" (UID: \"92b276db-ac4d-4f6b-a575-15495a488afb\") "
Apr 17 11:23:17.932557 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:17.932519 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b276db-ac4d-4f6b-a575-15495a488afb-kube-api-access-5vfrt" (OuterVolumeSpecName: "kube-api-access-5vfrt") pod "92b276db-ac4d-4f6b-a575-15495a488afb" (UID: "92b276db-ac4d-4f6b-a575-15495a488afb"). InnerVolumeSpecName "kube-api-access-5vfrt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:23:18.031120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:18.031033 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vfrt\" (UniqueName: \"kubernetes.io/projected/92b276db-ac4d-4f6b-a575-15495a488afb-kube-api-access-5vfrt\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 17 11:23:18.765416 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:18.765333 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7d44j" event={"ID":"92b276db-ac4d-4f6b-a575-15495a488afb","Type":"ContainerDied","Data":"2015c3050c8a22f62d45b0c24fdaf79297be7aad0f65d8cb88da687d5e16ad80"}
Apr 17 11:23:18.765416 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:18.765369 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2015c3050c8a22f62d45b0c24fdaf79297be7aad0f65d8cb88da687d5e16ad80"
Apr 17 11:23:18.765416 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:18.765350 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-7d44j"
Apr 17 11:23:19.295964 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.295930 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"]
Apr 17 11:23:19.296335 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.296249 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92b276db-ac4d-4f6b-a575-15495a488afb" containerName="s3-tls-init-custom"
Apr 17 11:23:19.296335 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.296260 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b276db-ac4d-4f6b-a575-15495a488afb" containerName="s3-tls-init-custom"
Apr 17 11:23:19.296335 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.296310 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="92b276db-ac4d-4f6b-a575-15495a488afb" containerName="s3-tls-init-custom"
Apr 17 11:23:19.298085 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.298065 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.300672 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.300653 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\""
Apr 17 11:23:19.300766 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.300696 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 17 11:23:19.306064 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.306023 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"]
Apr 17 11:23:19.341522 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.341498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc28\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-kube-api-access-xzc28\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.341652 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.341553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f667b56b-15a0-468a-9096-01b5dcfbe6f5-data\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.341652 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.341617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.442820 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.442787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f667b56b-15a0-468a-9096-01b5dcfbe6f5-data\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.442972 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.442833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.442972 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.442874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc28\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-kube-api-access-xzc28\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.443041 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:23:19.443025 2571 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found
Apr 17 11:23:19.443072 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:23:19.443042 2571 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj: secret "seaweedfs-tls-serving" not found
Apr 17 11:23:19.443120 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:23:19.443109 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-seaweedfs-tls-serving podName:f667b56b-15a0-468a-9096-01b5dcfbe6f5 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:19.943087859 +0000 UTC m=+416.074864529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-cvxsj" (UID: "f667b56b-15a0-468a-9096-01b5dcfbe6f5") : secret "seaweedfs-tls-serving" not found
Apr 17 11:23:19.443177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.443153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f667b56b-15a0-468a-9096-01b5dcfbe6f5-data\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.452439 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.452412 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc28\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-kube-api-access-xzc28\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.947034 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.946981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:19.949527 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:19.949504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f667b56b-15a0-468a-9096-01b5dcfbe6f5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cvxsj\" (UID: \"f667b56b-15a0-468a-9096-01b5dcfbe6f5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:20.208009 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:20.207915 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"
Apr 17 11:23:20.322454 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:20.322418 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj"]
Apr 17 11:23:20.326349 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:20.326320 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf667b56b_15a0_468a_9096_01b5dcfbe6f5.slice/crio-bb178b063fc93db041d24033492a376a8de0661d33f838c8572698db0f8f3052 WatchSource:0}: Error finding container bb178b063fc93db041d24033492a376a8de0661d33f838c8572698db0f8f3052: Status 404 returned error can't find the container with id bb178b063fc93db041d24033492a376a8de0661d33f838c8572698db0f8f3052
Apr 17 11:23:20.772237 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:20.772196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj" event={"ID":"f667b56b-15a0-468a-9096-01b5dcfbe6f5","Type":"ContainerStarted","Data":"de23522581be15c0b6d3bfc2388dde6755cb2da14cae930c1c9bbc86c46dc12d"}
Apr 17 11:23:20.772237 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:20.772238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj" event={"ID":"f667b56b-15a0-468a-9096-01b5dcfbe6f5","Type":"ContainerStarted","Data":"bb178b063fc93db041d24033492a376a8de0661d33f838c8572698db0f8f3052"}
Apr 17 11:23:20.787949 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:20.787895 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cvxsj" podStartSLOduration=1.554820405 podStartE2EDuration="1.787880517s" podCreationTimestamp="2026-04-17 11:23:19 +0000 UTC" firstStartedPulling="2026-04-17 11:23:20.327553255 +0000 UTC m=+416.459329920" lastFinishedPulling="2026-04-17 11:23:20.560613367 +0000 UTC m=+416.692390032" observedRunningTime="2026-04-17 11:23:20.7875703 +0000 UTC m=+416.919346989" watchObservedRunningTime="2026-04-17 11:23:20.787880517 +0000 UTC m=+416.919657205"
Apr 17 11:23:21.280668 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.280633 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-fcww6"]
Apr 17 11:23:21.283189 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.283173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:21.290509 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.290483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-fcww6"]
Apr 17 11:23:21.359231 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.359169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wmf\" (UniqueName: \"kubernetes.io/projected/4d418658-856e-4bb9-aaa9-95fd9c578e68-kube-api-access-c8wmf\") pod \"s3-tls-init-serving-fcww6\" (UID: \"4d418658-856e-4bb9-aaa9-95fd9c578e68\") " pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:21.460273 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.460235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wmf\" (UniqueName: \"kubernetes.io/projected/4d418658-856e-4bb9-aaa9-95fd9c578e68-kube-api-access-c8wmf\") pod \"s3-tls-init-serving-fcww6\" (UID: \"4d418658-856e-4bb9-aaa9-95fd9c578e68\") " pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:21.468711 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.468684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wmf\" (UniqueName: \"kubernetes.io/projected/4d418658-856e-4bb9-aaa9-95fd9c578e68-kube-api-access-c8wmf\") pod \"s3-tls-init-serving-fcww6\" (UID: \"4d418658-856e-4bb9-aaa9-95fd9c578e68\") " pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:21.602833 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.602756 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:21.719307 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.719282 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-fcww6"]
Apr 17 11:23:21.721516 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:21.721488 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d418658_856e_4bb9_aaa9_95fd9c578e68.slice/crio-9f066c1a75b7e1ca328bf74bedee1c901185f3a7dfbd63297263a5d2639756af WatchSource:0}: Error finding container 9f066c1a75b7e1ca328bf74bedee1c901185f3a7dfbd63297263a5d2639756af: Status 404 returned error can't find the container with id 9f066c1a75b7e1ca328bf74bedee1c901185f3a7dfbd63297263a5d2639756af
Apr 17 11:23:21.775612 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:21.775584 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fcww6" event={"ID":"4d418658-856e-4bb9-aaa9-95fd9c578e68","Type":"ContainerStarted","Data":"9f066c1a75b7e1ca328bf74bedee1c901185f3a7dfbd63297263a5d2639756af"}
Apr 17 11:23:22.779938 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:22.779903 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fcww6" event={"ID":"4d418658-856e-4bb9-aaa9-95fd9c578e68","Type":"ContainerStarted","Data":"b367b5a693844739d0c2c867971d10b927304f8727359748a0a2a3553c26504f"}
Apr 17 11:23:22.796771 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:22.796701 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-fcww6" podStartSLOduration=1.796686853 podStartE2EDuration="1.796686853s" podCreationTimestamp="2026-04-17 11:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:23:22.795162012 +0000 UTC m=+418.926938698" watchObservedRunningTime="2026-04-17 11:23:22.796686853 +0000 UTC m=+418.928463539"
Apr 17 11:23:25.792767 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:25.792733 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d418658-856e-4bb9-aaa9-95fd9c578e68" containerID="b367b5a693844739d0c2c867971d10b927304f8727359748a0a2a3553c26504f" exitCode=0
Apr 17 11:23:25.793177 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:25.792807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fcww6" event={"ID":"4d418658-856e-4bb9-aaa9-95fd9c578e68","Type":"ContainerDied","Data":"b367b5a693844739d0c2c867971d10b927304f8727359748a0a2a3553c26504f"}
Apr 17 11:23:26.916510 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:26.916488 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:27.007687 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:27.007652 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8wmf\" (UniqueName: \"kubernetes.io/projected/4d418658-856e-4bb9-aaa9-95fd9c578e68-kube-api-access-c8wmf\") pod \"4d418658-856e-4bb9-aaa9-95fd9c578e68\" (UID: \"4d418658-856e-4bb9-aaa9-95fd9c578e68\") "
Apr 17 11:23:27.009569 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:27.009540 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d418658-856e-4bb9-aaa9-95fd9c578e68-kube-api-access-c8wmf" (OuterVolumeSpecName: "kube-api-access-c8wmf") pod "4d418658-856e-4bb9-aaa9-95fd9c578e68" (UID: "4d418658-856e-4bb9-aaa9-95fd9c578e68"). InnerVolumeSpecName "kube-api-access-c8wmf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:23:27.108935 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:27.108854 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8wmf\" (UniqueName: \"kubernetes.io/projected/4d418658-856e-4bb9-aaa9-95fd9c578e68-kube-api-access-c8wmf\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 17 11:23:27.799753 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:27.799708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fcww6" event={"ID":"4d418658-856e-4bb9-aaa9-95fd9c578e68","Type":"ContainerDied","Data":"9f066c1a75b7e1ca328bf74bedee1c901185f3a7dfbd63297263a5d2639756af"}
Apr 17 11:23:27.799753 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:27.799733 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fcww6"
Apr 17 11:23:27.799753 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:27.799744 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f066c1a75b7e1ca328bf74bedee1c901185f3a7dfbd63297263a5d2639756af"
Apr 17 11:23:31.037255 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.037222 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2dsqw/must-gather-w9czj"]
Apr 17 11:23:31.037659 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.037573 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d418658-856e-4bb9-aaa9-95fd9c578e68" containerName="s3-tls-init-serving"
Apr 17 11:23:31.037659 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.037585 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d418658-856e-4bb9-aaa9-95fd9c578e68" containerName="s3-tls-init-serving"
Apr 17 11:23:31.037659 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.037637 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d418658-856e-4bb9-aaa9-95fd9c578e68" containerName="s3-tls-init-serving"
Apr 17 11:23:31.063898 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.063866 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2dsqw/must-gather-w9czj"]
Apr 17 11:23:31.064043 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.063929 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.067596 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.067573 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2dsqw\"/\"default-dockercfg-nrgjr\""
Apr 17 11:23:31.068644 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.068625 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2dsqw\"/\"kube-root-ca.crt\""
Apr 17 11:23:31.068854 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.068637 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2dsqw\"/\"openshift-service-ca.crt\""
Apr 17 11:23:31.143118 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.143083 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/962d1c27-054c-48ff-b50d-be979bf6f753-must-gather-output\") pod \"must-gather-w9czj\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") " pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.143307 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.143140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlh6\" (UniqueName: \"kubernetes.io/projected/962d1c27-054c-48ff-b50d-be979bf6f753-kube-api-access-7hlh6\") pod \"must-gather-w9czj\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") " pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.243965 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.243928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/962d1c27-054c-48ff-b50d-be979bf6f753-must-gather-output\") pod \"must-gather-w9czj\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") " pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.244133 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.243992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlh6\" (UniqueName: \"kubernetes.io/projected/962d1c27-054c-48ff-b50d-be979bf6f753-kube-api-access-7hlh6\") pod \"must-gather-w9czj\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") " pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.244338 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.244317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/962d1c27-054c-48ff-b50d-be979bf6f753-must-gather-output\") pod \"must-gather-w9czj\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") " pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.253349 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.253323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlh6\" (UniqueName: \"kubernetes.io/projected/962d1c27-054c-48ff-b50d-be979bf6f753-kube-api-access-7hlh6\") pod \"must-gather-w9czj\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") " pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.387208 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.387128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:31.510196 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.510167 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2dsqw/must-gather-w9czj"]
Apr 17 11:23:31.512580 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:31.512553 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod962d1c27_054c_48ff_b50d_be979bf6f753.slice/crio-02a6984ed29caf4eb5d07aae0617f32e4b286e674ae92fea890ab1c7d9679f73 WatchSource:0}: Error finding container 02a6984ed29caf4eb5d07aae0617f32e4b286e674ae92fea890ab1c7d9679f73: Status 404 returned error can't find the container with id 02a6984ed29caf4eb5d07aae0617f32e4b286e674ae92fea890ab1c7d9679f73
Apr 17 11:23:31.812266 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:31.812231 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dsqw/must-gather-w9czj" event={"ID":"962d1c27-054c-48ff-b50d-be979bf6f753","Type":"ContainerStarted","Data":"02a6984ed29caf4eb5d07aae0617f32e4b286e674ae92fea890ab1c7d9679f73"}
Apr 17 11:23:36.832077 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:36.832039 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dsqw/must-gather-w9czj" event={"ID":"962d1c27-054c-48ff-b50d-be979bf6f753","Type":"ContainerStarted","Data":"e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c"}
Apr 17 11:23:36.832077 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:36.832077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dsqw/must-gather-w9czj" event={"ID":"962d1c27-054c-48ff-b50d-be979bf6f753","Type":"ContainerStarted","Data":"79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955"}
Apr 17 11:23:36.848260 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:36.848175 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2dsqw/must-gather-w9czj" podStartSLOduration=1.146995697 podStartE2EDuration="5.848160898s" podCreationTimestamp="2026-04-17 11:23:31 +0000 UTC" firstStartedPulling="2026-04-17 11:23:31.514308289 +0000 UTC m=+427.646084954" lastFinishedPulling="2026-04-17 11:23:36.215473489 +0000 UTC m=+432.347250155" observedRunningTime="2026-04-17 11:23:36.847160663 +0000 UTC m=+432.978937351" watchObservedRunningTime="2026-04-17 11:23:36.848160898 +0000 UTC m=+432.979937585"
Apr 17 11:23:52.887744 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:52.887710 2571 generic.go:358] "Generic (PLEG): container finished" podID="962d1c27-054c-48ff-b50d-be979bf6f753" containerID="79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955" exitCode=0
Apr 17 11:23:52.888178 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:52.887785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dsqw/must-gather-w9czj" event={"ID":"962d1c27-054c-48ff-b50d-be979bf6f753","Type":"ContainerDied","Data":"79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955"}
Apr 17 11:23:52.888178 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:52.888115 2571 scope.go:117] "RemoveContainer" containerID="79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955"
Apr 17 11:23:53.164885 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.164804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2dsqw_must-gather-w9czj_962d1c27-054c-48ff-b50d-be979bf6f753/gather/0.log"
Apr 17 11:23:53.746434 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.746398 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s657/must-gather-4cvt7"]
Apr 17 11:23:53.748847 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.748831 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:53.751626 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.751606 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s657\"/\"kube-root-ca.crt\""
Apr 17 11:23:53.751741 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.751640 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s657\"/\"openshift-service-ca.crt\""
Apr 17 11:23:53.770847 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.770814 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s657/must-gather-4cvt7"]
Apr 17 11:23:53.844028 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.843986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70db82b8-6c7d-4a2d-8094-c9f01180e6c9-must-gather-output\") pod \"must-gather-4cvt7\" (UID: \"70db82b8-6c7d-4a2d-8094-c9f01180e6c9\") " pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:53.844255 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.844123 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvtg\" (UniqueName: \"kubernetes.io/projected/70db82b8-6c7d-4a2d-8094-c9f01180e6c9-kube-api-access-gnvtg\") pod \"must-gather-4cvt7\" (UID: \"70db82b8-6c7d-4a2d-8094-c9f01180e6c9\") " pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:53.944879 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.944846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70db82b8-6c7d-4a2d-8094-c9f01180e6c9-must-gather-output\") pod \"must-gather-4cvt7\" (UID: \"70db82b8-6c7d-4a2d-8094-c9f01180e6c9\") " pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:53.945361 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.944914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvtg\" (UniqueName: \"kubernetes.io/projected/70db82b8-6c7d-4a2d-8094-c9f01180e6c9-kube-api-access-gnvtg\") pod \"must-gather-4cvt7\" (UID: \"70db82b8-6c7d-4a2d-8094-c9f01180e6c9\") " pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:53.945361 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.945275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70db82b8-6c7d-4a2d-8094-c9f01180e6c9-must-gather-output\") pod \"must-gather-4cvt7\" (UID: \"70db82b8-6c7d-4a2d-8094-c9f01180e6c9\") " pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:53.957530 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:53.957499 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvtg\" (UniqueName: \"kubernetes.io/projected/70db82b8-6c7d-4a2d-8094-c9f01180e6c9-kube-api-access-gnvtg\") pod \"must-gather-4cvt7\" (UID: \"70db82b8-6c7d-4a2d-8094-c9f01180e6c9\") " pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:54.057746 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:54.057654 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s657/must-gather-4cvt7"
Apr 17 11:23:54.192442 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:54.192261 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s657/must-gather-4cvt7"]
Apr 17 11:23:54.195243 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:23:54.195201 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70db82b8_6c7d_4a2d_8094_c9f01180e6c9.slice/crio-6d28ff7a8995e741d0a430dc250d25330dfb7bf74235d405b9a0e83dd5b98b9e WatchSource:0}: Error finding container 6d28ff7a8995e741d0a430dc250d25330dfb7bf74235d405b9a0e83dd5b98b9e: Status 404 returned error can't find the container with id 6d28ff7a8995e741d0a430dc250d25330dfb7bf74235d405b9a0e83dd5b98b9e
Apr 17 11:23:54.894847 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:54.894810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s657/must-gather-4cvt7" event={"ID":"70db82b8-6c7d-4a2d-8094-c9f01180e6c9","Type":"ContainerStarted","Data":"6d28ff7a8995e741d0a430dc250d25330dfb7bf74235d405b9a0e83dd5b98b9e"}
Apr 17 11:23:55.899615 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:55.899580 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s657/must-gather-4cvt7" event={"ID":"70db82b8-6c7d-4a2d-8094-c9f01180e6c9","Type":"ContainerStarted","Data":"412b77dd4e575689545a4fdb548f7feaa31ecf87a5dfba4485cf04fd8092c64e"}
Apr 17 11:23:55.899977 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:55.899621 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s657/must-gather-4cvt7" event={"ID":"70db82b8-6c7d-4a2d-8094-c9f01180e6c9","Type":"ContainerStarted","Data":"f837007b90cbf90c2dab9c0bda44f12c041e28ba5d7bd0d69af2d0cea41d515c"}
Apr 17 11:23:55.928768 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:55.928718 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s657/must-gather-4cvt7" podStartSLOduration=2.135165108 podStartE2EDuration="2.928700976s" podCreationTimestamp="2026-04-17 11:23:53 +0000 UTC" firstStartedPulling="2026-04-17 11:23:54.197383462 +0000 UTC m=+450.329160135" lastFinishedPulling="2026-04-17 11:23:54.990919338 +0000 UTC m=+451.122696003" observedRunningTime="2026-04-17 11:23:55.92827815 +0000 UTC m=+452.060054838" watchObservedRunningTime="2026-04-17 11:23:55.928700976 +0000 UTC m=+452.060477663"
Apr 17 11:23:56.771246 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:56.771204 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-54fdx_d9c775e3-9dbe-4167-ae13-e921d68e51d8/global-pull-secret-syncer/0.log"
Apr 17 11:23:57.121703 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:57.121618 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wk5dr_823b0362-e067-41b2-b7ac-63303987f87e/konnectivity-agent/0.log"
Apr 17 11:23:57.336113 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:57.336085 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-136.ec2.internal_57b16f66466a195429f73bc1a0dcec09/haproxy/0.log"
Apr 17 11:23:58.517533 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.517498 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2dsqw/must-gather-w9czj"]
Apr 17 11:23:58.517982 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.517819 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-2dsqw/must-gather-w9czj" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="copy" containerID="cri-o://e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c" gracePeriod=2
Apr 17 11:23:58.525422 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.525405 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2dsqw/must-gather-w9czj"]
Apr 17 11:23:58.885063 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.885021 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2dsqw_must-gather-w9czj_962d1c27-054c-48ff-b50d-be979bf6f753/copy/0.log"
Apr 17 11:23:58.885558 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.885524 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2dsqw/must-gather-w9czj"
Apr 17 11:23:58.888441 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.888396 2571 status_manager.go:895] "Failed to get status for pod" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" pod="openshift-must-gather-2dsqw/must-gather-w9czj" err="pods \"must-gather-w9czj\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2dsqw\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 17 11:23:58.904234 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.902479 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/962d1c27-054c-48ff-b50d-be979bf6f753-must-gather-output\") pod \"962d1c27-054c-48ff-b50d-be979bf6f753\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") "
Apr 17 11:23:58.904234 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.902595 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hlh6\" (UniqueName: \"kubernetes.io/projected/962d1c27-054c-48ff-b50d-be979bf6f753-kube-api-access-7hlh6\") pod \"962d1c27-054c-48ff-b50d-be979bf6f753\" (UID: \"962d1c27-054c-48ff-b50d-be979bf6f753\") "
Apr 17 11:23:58.907581 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.907527 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962d1c27-054c-48ff-b50d-be979bf6f753-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "962d1c27-054c-48ff-b50d-be979bf6f753" (UID: "962d1c27-054c-48ff-b50d-be979bf6f753"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:23:58.922419 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.922371 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2dsqw_must-gather-w9czj_962d1c27-054c-48ff-b50d-be979bf6f753/copy/0.log"
Apr 17 11:23:58.923059 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.923034 2571 generic.go:358] "Generic (PLEG): container finished" podID="962d1c27-054c-48ff-b50d-be979bf6f753" containerID="e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c" exitCode=143
Apr 17 11:23:58.923379 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.923365 2571 scope.go:117] "RemoveContainer" containerID="e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c"
Apr 17 11:23:58.923702 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.923685 2571 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-2dsqw/must-gather-w9czj" Apr 17 11:23:58.926641 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.926602 2571 status_manager.go:895] "Failed to get status for pod" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" pod="openshift-must-gather-2dsqw/must-gather-w9czj" err="pods \"must-gather-w9czj\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2dsqw\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object" Apr 17 11:23:58.935079 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.934680 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962d1c27-054c-48ff-b50d-be979bf6f753-kube-api-access-7hlh6" (OuterVolumeSpecName: "kube-api-access-7hlh6") pod "962d1c27-054c-48ff-b50d-be979bf6f753" (UID: "962d1c27-054c-48ff-b50d-be979bf6f753"). InnerVolumeSpecName "kube-api-access-7hlh6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:23:58.954654 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.954337 2571 scope.go:117] "RemoveContainer" containerID="79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955" Apr 17 11:23:58.981570 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.981262 2571 scope.go:117] "RemoveContainer" containerID="e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c" Apr 17 11:23:58.982069 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:23:58.981818 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c\": container with ID starting with e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c not found: ID does not exist" containerID="e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c" Apr 17 11:23:58.982069 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.981854 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c"} err="failed to get container status \"e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c\": rpc error: code = NotFound desc = could not find container \"e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c\": container with ID starting with e4fa994b05b4533f604e5a26f2d6e8835219ff3777bb15f599eb00e86e784b6c not found: ID does not exist" Apr 17 11:23:58.982069 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.981880 2571 scope.go:117] "RemoveContainer" containerID="79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955" Apr 17 11:23:58.982539 ip-10-0-139-136 kubenswrapper[2571]: E0417 11:23:58.982447 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955\": container with ID starting with 79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955 not found: ID does not exist" containerID="79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955" Apr 17 11:23:58.982539 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:58.982475 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955"} err="failed to get container status \"79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955\": rpc error: code = NotFound desc = could not find container \"79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955\": container with ID starting with 79b864efe5106e3559748008e5ecb7ece79c972f065de3403f85326b90c92955 not found: ID does not exist" Apr 17 11:23:59.003493 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:59.003412 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hlh6\" (UniqueName: \"kubernetes.io/projected/962d1c27-054c-48ff-b50d-be979bf6f753-kube-api-access-7hlh6\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 17 11:23:59.003493 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:59.003450 2571 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/962d1c27-054c-48ff-b50d-be979bf6f753-must-gather-output\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 17 11:23:59.239249 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:23:59.239192 2571 status_manager.go:895] "Failed to get status for pod" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" pod="openshift-must-gather-2dsqw/must-gather-w9czj" err="pods \"must-gather-w9czj\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-2dsqw\": no relationship found between 
node 'ip-10-0-139-136.ec2.internal' and this object" Apr 17 11:24:00.025365 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.024551 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/alertmanager/0.log" Apr 17 11:24:00.062709 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.062672 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/config-reloader/0.log" Apr 17 11:24:00.093943 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.093908 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/kube-rbac-proxy-web/0.log" Apr 17 11:24:00.140554 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.140521 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/kube-rbac-proxy/0.log" Apr 17 11:24:00.170171 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.170140 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/kube-rbac-proxy-metric/0.log" Apr 17 11:24:00.202385 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.202349 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/prom-label-proxy/0.log" Apr 17 11:24:00.228959 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.228915 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_36b27e94-81e6-4770-b057-9405a36d62a7/init-config-reloader/0.log" Apr 17 11:24:00.302362 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.302269 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dglc7_4ab495b3-f50d-4e6d-b409-0e248fd3eb5d/kube-state-metrics/0.log" Apr 17 11:24:00.358599 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.358566 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dglc7_4ab495b3-f50d-4e6d-b409-0e248fd3eb5d/kube-rbac-proxy-main/0.log" Apr 17 11:24:00.428385 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.428347 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dglc7_4ab495b3-f50d-4e6d-b409-0e248fd3eb5d/kube-rbac-proxy-self/0.log" Apr 17 11:24:00.510630 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.510596 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" path="/var/lib/kubelet/pods/962d1c27-054c-48ff-b50d-be979bf6f753/volumes" Apr 17 11:24:00.588666 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.588595 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-mprdx_5cd93e96-9968-4a9d-99fc-66831dcd7f1f/monitoring-plugin/0.log" Apr 17 11:24:00.698397 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.698369 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-clrnz_6aa6316d-30a4-4962-abd4-b78bcc544001/node-exporter/0.log" Apr 17 11:24:00.720639 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.720597 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-clrnz_6aa6316d-30a4-4962-abd4-b78bcc544001/kube-rbac-proxy/0.log" Apr 17 11:24:00.747674 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.747644 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-clrnz_6aa6316d-30a4-4962-abd4-b78bcc544001/init-textfile/0.log" Apr 17 11:24:00.881308 ip-10-0-139-136 
kubenswrapper[2571]: I0417 11:24:00.881202 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qlcxv_21ac4b07-6a87-445d-bdfe-0ea94f7f73eb/kube-rbac-proxy-main/0.log" Apr 17 11:24:00.907940 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.907904 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qlcxv_21ac4b07-6a87-445d-bdfe-0ea94f7f73eb/kube-rbac-proxy-self/0.log" Apr 17 11:24:00.936465 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.936440 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qlcxv_21ac4b07-6a87-445d-bdfe-0ea94f7f73eb/openshift-state-metrics/0.log" Apr 17 11:24:00.993606 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:00.993573 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/prometheus/0.log" Apr 17 11:24:01.021353 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:01.021324 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/config-reloader/0.log" Apr 17 11:24:01.047908 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:01.047875 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/thanos-sidecar/0.log" Apr 17 11:24:01.074829 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:01.074800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/kube-rbac-proxy-web/0.log" Apr 17 11:24:01.096197 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:01.096168 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/kube-rbac-proxy/0.log" 
Apr 17 11:24:01.120590 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:01.120560 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/kube-rbac-proxy-thanos/0.log" Apr 17 11:24:01.147755 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:01.147683 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90e4e187-064b-4359-8536-37e5cfdf4231/init-config-reloader/0.log" Apr 17 11:24:04.531075 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531041 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj"] Apr 17 11:24:04.531647 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531503 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="gather" Apr 17 11:24:04.531647 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531521 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="gather" Apr 17 11:24:04.531647 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531558 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="copy" Apr 17 11:24:04.531647 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531567 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="copy" Apr 17 11:24:04.531843 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531656 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="gather" Apr 17 11:24:04.531843 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.531671 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="962d1c27-054c-48ff-b50d-be979bf6f753" containerName="copy" Apr 17 
11:24:04.535946 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.535917 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.539683 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.539262 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7s657\"/\"default-dockercfg-7h8qt\"" Apr 17 11:24:04.549146 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.549124 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj"] Apr 17 11:24:04.555446 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.555425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-proc\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.555576 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.555472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2292t\" (UniqueName: \"kubernetes.io/projected/a321a1cf-20f6-47d9-9255-f918b451ce3e-kube-api-access-2292t\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.555635 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.555577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-sys\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.555635 
ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.555608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-podres\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.555733 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.555656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-lib-modules\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.656918 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.656883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2292t\" (UniqueName: \"kubernetes.io/projected/a321a1cf-20f6-47d9-9255-f918b451ce3e-kube-api-access-2292t\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-sys\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657053 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-podres\") pod 
\"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-lib-modules\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-proc\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-sys\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657250 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-proc\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-podres\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.657311 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.657305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a321a1cf-20f6-47d9-9255-f918b451ce3e-lib-modules\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.671175 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.671140 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2292t\" (UniqueName: \"kubernetes.io/projected/a321a1cf-20f6-47d9-9255-f918b451ce3e-kube-api-access-2292t\") pod \"perf-node-gather-daemonset-kzknj\" (UID: \"a321a1cf-20f6-47d9-9255-f918b451ce3e\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.711114 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.711083 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7t99h_3064d784-d81b-4df1-ad85-09f7ec7037db/dns/0.log" Apr 17 11:24:04.737811 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.737779 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7t99h_3064d784-d81b-4df1-ad85-09f7ec7037db/kube-rbac-proxy/0.log" Apr 17 11:24:04.848704 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.848634 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:04.896725 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:04.896702 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rd6gl_99e275f8-856c-4b32-9170-b68141483240/dns-node-resolver/0.log" Apr 17 11:24:05.015065 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:05.015042 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj"] Apr 17 11:24:05.017711 ip-10-0-139-136 kubenswrapper[2571]: W0417 11:24:05.017677 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda321a1cf_20f6_47d9_9255_f918b451ce3e.slice/crio-8e7eee8ff55b0c88307b91b5c5182f1357c0c848e576734064bff666e9c268db WatchSource:0}: Error finding container 8e7eee8ff55b0c88307b91b5c5182f1357c0c848e576734064bff666e9c268db: Status 404 returned error can't find the container with id 8e7eee8ff55b0c88307b91b5c5182f1357c0c848e576734064bff666e9c268db Apr 17 11:24:05.378934 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:05.378895 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j5ljp_d325f4f2-5692-4489-9be3-cb5403f6b917/node-ca/0.log" Apr 17 11:24:05.953846 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:05.953798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" event={"ID":"a321a1cf-20f6-47d9-9255-f918b451ce3e","Type":"ContainerStarted","Data":"e2f0d9739221d109b55ebcf35e019297dd0da038d6c43b99677a0ef83d879194"} Apr 17 11:24:05.953846 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:05.953849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" 
event={"ID":"a321a1cf-20f6-47d9-9255-f918b451ce3e","Type":"ContainerStarted","Data":"8e7eee8ff55b0c88307b91b5c5182f1357c0c848e576734064bff666e9c268db"} Apr 17 11:24:05.954403 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:05.953950 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:05.973083 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:05.973039 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" podStartSLOduration=1.9730250630000001 podStartE2EDuration="1.973025063s" podCreationTimestamp="2026-04-17 11:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:24:05.971918401 +0000 UTC m=+462.103695088" watchObservedRunningTime="2026-04-17 11:24:05.973025063 +0000 UTC m=+462.104801749" Apr 17 11:24:06.470656 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:06.470625 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tjbc7_586b9443-fc3c-482e-be48-5fd2e6e6cbe4/serve-healthcheck-canary/0.log" Apr 17 11:24:06.842206 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:06.842123 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5m4mb_64efb360-ec45-436d-87b0-6cd63e034c78/kube-rbac-proxy/0.log" Apr 17 11:24:06.860544 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:06.860503 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5m4mb_64efb360-ec45-436d-87b0-6cd63e034c78/exporter/0.log" Apr 17 11:24:06.879785 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:06.879758 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-5m4mb_64efb360-ec45-436d-87b0-6cd63e034c78/extractor/0.log" Apr 17 11:24:08.976600 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:08.976570 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-rm476_4ba10494-e1c2-44de-854e-2011424b1a9b/manager/0.log" Apr 17 11:24:08.994687 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:08.994636 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-stvql_1b608843-b9ab-4137-af3d-1380091da4d0/s3-init/0.log" Apr 17 11:24:09.016613 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:09.016573 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-7d44j_92b276db-ac4d-4f6b-a575-15495a488afb/s3-tls-init-custom/0.log" Apr 17 11:24:09.037181 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:09.037142 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-fcww6_4d418658-856e-4bb9-aaa9-95fd9c578e68/s3-tls-init-serving/0.log" Apr 17 11:24:09.064158 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:09.064134 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-76srm_1a135322-5f7f-4fea-a693-49ac4d55f176/seaweedfs/0.log" Apr 17 11:24:09.087478 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:09.087446 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-gvbp9_c0af8ff0-37fe-471a-9a37-e5e8cb2cfaf0/seaweedfs-tls-custom/0.log" Apr 17 11:24:09.110490 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:09.110457 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-cvxsj_f667b56b-15a0-468a-9096-01b5dcfbe6f5/seaweedfs-tls-serving/0.log" Apr 17 11:24:11.967013 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:11.966985 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-7s657/perf-node-gather-daemonset-kzknj" Apr 17 11:24:14.524477 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.524402 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/kube-multus-additional-cni-plugins/0.log" Apr 17 11:24:14.545123 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.545091 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/egress-router-binary-copy/0.log" Apr 17 11:24:14.566409 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.566380 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/cni-plugins/0.log" Apr 17 11:24:14.586523 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.586499 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/bond-cni-plugin/0.log" Apr 17 11:24:14.606136 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.606114 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/routeoverride-cni/0.log" Apr 17 11:24:14.626862 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.626834 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/whereabouts-cni-bincopy/0.log" Apr 17 11:24:14.644839 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.644817 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bq85c_924b7a18-8af5-4d88-b11b-b9df79a3809c/whereabouts-cni/0.log" Apr 17 11:24:14.840578 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.840509 
2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q9pfr_5f80200a-a69d-4400-b71b-efebc2ef29c6/kube-multus/0.log" Apr 17 11:24:14.982120 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:14.982089 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fcgzf_1b1c44fb-0716-4e07-9409-72264a348f29/network-metrics-daemon/0.log" Apr 17 11:24:15.004554 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:15.004527 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fcgzf_1b1c44fb-0716-4e07-9409-72264a348f29/kube-rbac-proxy/0.log" Apr 17 11:24:16.101194 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.101163 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/ovn-controller/0.log" Apr 17 11:24:16.121692 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.121665 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/ovn-acl-logging/0.log" Apr 17 11:24:16.137996 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.137974 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/kube-rbac-proxy-node/0.log" Apr 17 11:24:16.157262 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.157233 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:24:16.177850 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.177823 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/northd/0.log" Apr 17 11:24:16.197763 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.197733 
2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/nbdb/0.log" Apr 17 11:24:16.219429 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.219350 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/sbdb/0.log" Apr 17 11:24:16.416585 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:16.416555 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nj28v_45d5b2c3-f6da-4326-a766-68dc042f85ef/ovnkube-controller/0.log" Apr 17 11:24:17.781165 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:17.781138 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-82xvr_97b2b03f-63f9-420a-8e36-4e191f507077/network-check-target-container/0.log" Apr 17 11:24:18.768061 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:18.768036 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h527p_37c3dfdf-e797-46f5-bc3b-3ec9329d15fe/iptables-alerter/0.log" Apr 17 11:24:19.422259 ip-10-0-139-136 kubenswrapper[2571]: I0417 11:24:19.422205 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7st2w_9be12e9c-2a20-44bd-8808-6135db522a47/tuned/0.log"