Apr 23 09:28:28.548006 ip-10-0-129-154 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 09:28:28.548016 ip-10-0-129-154 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 09:28:28.548023 ip-10-0-129-154 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 09:28:28.548267 ip-10-0-129-154 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 09:28:38.615554 ip-10-0-129-154 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 09:28:38.615571 ip-10-0-129-154 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1e9d72c7ed2b4197a1d5cf67b7b51303 --
Apr 23 09:30:46.601535 ip-10-0-129-154 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 09:30:47.115276 ip-10-0-129-154 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:47.115276 ip-10-0-129-154 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 09:30:47.115276 ip-10-0-129-154 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:47.115276 ip-10-0-129-154 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 09:30:47.115276 ip-10-0-129-154 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:47.116450 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.116356 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 09:30:47.120745 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120723 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:47.120745 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120743 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:47.120745 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120747 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120751 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120754 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120756 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120759 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120762 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120766 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120768 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120771 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120773 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120776 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120779 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120782 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120784 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120791 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120794 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120797 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120799 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120802 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:47.120845 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120806 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120809 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120812 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120815 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120818 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120820 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120823 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120826 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120828 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120831 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120834 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120836 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120839 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120841 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120845 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120849 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120852 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120856 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120858 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120861 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:47.121334 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120864 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120866 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120870 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120872 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120875 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120877 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120880 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120882 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120885 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120888 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120890 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120893 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120896 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120898 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120901 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120904 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120906 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120910 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120912 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120915 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:47.121828 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120917 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120920 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120922 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120925 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120928 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120930 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120933 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120937 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120941 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120945 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120947 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120950 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120953 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120955 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120958 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120961 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120964 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120969 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120972 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:47.122335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120976 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120978 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120983 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120986 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120989 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.120992 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121419 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121425 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121428 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121431 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121435 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121438 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121440 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121443 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121446 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121449 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121452 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121455 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121458 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:47.122800 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121461 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121464 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121468 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121470 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121473 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121476 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121479 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121481 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121484 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121487 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121489 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121492 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121494 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121497 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121499 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121502 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121504 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121507 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121509 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121512 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:47.123277 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121514 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121518 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121520 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121523 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121525 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121528 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121531 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121533 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121535 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121538 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121541 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121543 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121546 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121549 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121551 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121554 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121558 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121560 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121563 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121566 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:47.123793 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121568 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121571 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121573 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121575 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121578 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121581 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121583 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121586 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121588 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121592 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121595 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121598 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121600 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121604 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121607 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121609 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121612 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121614 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121617 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121620 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:47.124327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121623 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121626 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121628 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121631 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121635 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121638 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121641 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121645 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121647 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121650 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121652 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121655 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.121657 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121738 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121745 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121753 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121757 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121763 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121766 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121771 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 09:30:47.124858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121776 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121779 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121782 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121785 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121789 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121792 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121795 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121798 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121801 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121804 2575 flags.go:64] FLAG: --cloud-config=""
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121807 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121810 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121815 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121818 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121821 2575 flags.go:64] FLAG: --config-dir=""
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121824 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121828 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121832 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121835 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121839 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121844 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121848 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121851 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121854 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121857 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 09:30:47.125441 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121860 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121865 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121868 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121870 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121873 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121877 2575 flags.go:64] FLAG: --enable-server="true"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121880 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121885 2575 flags.go:64] FLAG: --event-burst="100"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121888 2575 flags.go:64] FLAG: --event-qps="50"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121891 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121895 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121898 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121902 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121905 2575
flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121908 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121911 2575 flags.go:64] FLAG: --eviction-soft="" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121914 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121917 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121919 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121923 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121926 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121929 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121932 2575 flags.go:64] FLAG: --feature-gates="" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121936 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121939 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 09:30:47.126048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121943 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121946 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121950 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 23 09:30:47.126712 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121954 2575 flags.go:64] FLAG: --help="false" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121957 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121960 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121963 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121966 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121970 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121974 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121977 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121980 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121983 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121986 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121989 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121992 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121995 2575 
flags.go:64] FLAG: --kube-reserved="" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.121998 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122001 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122008 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122011 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122014 2575 flags.go:64] FLAG: --lock-file="" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122017 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122020 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 09:30:47.126712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122023 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122029 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122033 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122036 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122039 2575 flags.go:64] FLAG: --logging-format="text" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122042 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122046 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 09:30:47.127334 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122049 2575 flags.go:64] FLAG: --manifest-url="" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122052 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122057 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122060 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122064 2575 flags.go:64] FLAG: --max-pods="110" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122078 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122082 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122085 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122088 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122091 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122094 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122097 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122106 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122110 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:30:47.122113 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122116 2575 flags.go:64] FLAG: --pod-cidr="" Apr 23 09:30:47.127334 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122119 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122125 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122128 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122132 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122135 2575 flags.go:64] FLAG: --port="10250" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122139 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122142 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00b9eac04238f40ef" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122146 2575 flags.go:64] FLAG: --qos-reserved="" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122149 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122152 2575 flags.go:64] FLAG: --register-node="true" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122155 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122158 2575 flags.go:64] FLAG: --register-with-taints="" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122162 2575 flags.go:64] FLAG: 
--registry-burst="10" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122165 2575 flags.go:64] FLAG: --registry-qps="5" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122168 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122170 2575 flags.go:64] FLAG: --reserved-memory="" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122189 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122193 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122196 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122199 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122202 2575 flags.go:64] FLAG: --runonce="false" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122205 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122214 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122217 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122220 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122223 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 09:30:47.127934 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122226 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:30:47.122229 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122232 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122241 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122244 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122247 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122250 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122253 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122256 2575 flags.go:64] FLAG: --system-cgroups="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122259 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122264 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122268 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122271 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122276 2575 flags.go:64] FLAG: --tls-min-version="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122278 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122281 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 
09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122284 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122287 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122290 2575 flags.go:64] FLAG: --v="2" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122294 2575 flags.go:64] FLAG: --version="false" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122298 2575 flags.go:64] FLAG: --vmodule="" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122303 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.122306 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122446 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 09:30:47.128602 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122451 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122454 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122457 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122459 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122462 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 09:30:47.129275 ip-10-0-129-154 
kubenswrapper[2575]: W0423 09:30:47.122465 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122468 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122471 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122473 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122476 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122479 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122481 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122484 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122486 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122489 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122491 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122494 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122496 2575 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122499 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 09:30:47.129275 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122503 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122505 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122508 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122510 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122513 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122517 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122521 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122524 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122527 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122530 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122533 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122536 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122539 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122542 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122544 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122546 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122549 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122551 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122554 2575 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122557 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 09:30:47.129751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122560 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122562 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122565 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122567 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122569 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122572 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122575 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122578 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122580 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122583 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122585 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 09:30:47.130272 ip-10-0-129-154 
kubenswrapper[2575]: W0423 09:30:47.122587 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122591 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122594 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122597 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122601 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122604 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122607 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122609 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 09:30:47.130272 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122612 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122614 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122617 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122621 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122623 2575 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122626 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122629 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122631 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122634 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122637 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122639 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122642 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122644 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122648 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122650 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122653 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122655 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122657 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122660 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122662 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:47.130938 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122665 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122667 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122670 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122672 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122675 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122679 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.122681 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.123498 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.131335 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 09:30:47.131621 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.131365 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131677 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131694 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131698 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131703 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131707 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131710 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131713 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131716 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131719 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131723 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131726 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131729 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131731 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131734 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131737 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131739 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131742 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131745 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131747 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:47.131901 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131750 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131753 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131757 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131759 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131762 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131765 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131767 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131770 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131773 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131775 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131779 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131782 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131785 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131788 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131791 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131794 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131797 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131800 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131802 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131805 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:47.132440 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131808 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131810 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131814 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131816 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131819 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131822 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131825 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131827 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131830 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131833 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131836 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131838 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131841 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131843 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131846 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131850 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131855 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131858 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131861 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:47.132932 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131864 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131867 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131871 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131874 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131877 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131880 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131883 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131886 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131889 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131892 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131895 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131898 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131901 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131904 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131907 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131909 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131912 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131915 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131920 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:47.133415 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131922 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131925 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131928 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131930 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131933 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131935 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131938 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131941 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.131943 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.131949 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132089 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132094 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132097 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132100 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132103 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:47.133894 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132106 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132108 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132111 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132114 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132116 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132120 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132123 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132126 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132128 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132131 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132134 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132137 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132139 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132141 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132144 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132147 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132149 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132153 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132156 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132159 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:47.134382 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132162 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132164 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132167 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132169 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132186 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132189 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132191 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132194 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132197 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132200 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132202 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132205 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132207 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132210 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132213 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132216 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132218 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132221 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132224 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132227 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:47.134884 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132230 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132232 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132235 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132238 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132240 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132243 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132246 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132248 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132251 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132254 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132257 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132259 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132262 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132264 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132267 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132269 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132273 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132276 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132279 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:47.135403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132282 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132284 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132287 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132290 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132292 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132295 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132297 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132300 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132303 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132305 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132308 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132311 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132313 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132316 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132319 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132322 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132324 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132328 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132331 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:47.135888 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132335 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132338 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:47.132341 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.132345 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.132483 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.134936 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.135995 2575 server.go:1019] "Starting client certificate rotation"
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.136094 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 09:30:47.136400 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.136137 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 09:30:47.166463 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.166437 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 09:30:47.172398 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.172370 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 09:30:47.187034 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.187010 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 23 09:30:47.192324 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.192299 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 09:30:47.193595 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.193580 2575 log.go:25] "Validated CRI v1 image API"
Apr 23 09:30:47.195825 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.195810 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 09:30:47.199011 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.198991 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 be76ea9d-3143-4e5d-ad74-8676463c8c1f:/dev/nvme0n1p3 f3d00bab-14ea-4469-8099-6507ef1878ac:/dev/nvme0n1p4]
Apr 23 09:30:47.199066 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.199011 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 09:30:47.205775 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.205657 2575 manager.go:217] Machine: {Timestamp:2026-04-23 09:30:47.20345479 +0000 UTC m=+0.466841932 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3116396 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28ad34f14d790f71eb0b56b0305af0 SystemUUID:ec28ad34-f14d-790f-71eb-0b56b0305af0 BootID:1e9d72c7-ed2b-4197-a1d5-cf67b7b51303 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:13:5b:92:b0:0b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:13:5b:92:b0:0b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:aa:6e:e9:b3:e3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 09:30:47.205775 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.205763 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 09:30:47.205907 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.205883 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 09:30:47.208000 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.207978 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 09:30:47.208148 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.208003 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-154.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 09:30:47.208208 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.208157 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 09:30:47.208208 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.208165 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 09:30:47.208208 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.208191
2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 09:30:47.209079 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.209069 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 09:30:47.210557 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.210539 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wxk6t" Apr 23 09:30:47.210768 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.210757 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 09:30:47.210881 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.210872 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 09:30:47.214781 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.214770 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 23 09:30:47.214816 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.214784 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 09:30:47.214816 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.214796 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 09:30:47.214816 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.214805 2575 kubelet.go:397] "Adding apiserver pod source" Apr 23 09:30:47.214816 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.214814 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 09:30:47.216029 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.216008 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 09:30:47.216111 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.216069 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 09:30:47.219571 ip-10-0-129-154 
kubenswrapper[2575]: I0423 09:30:47.219554 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 09:30:47.220697 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.220679 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wxk6t" Apr 23 09:30:47.221225 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.221192 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 09:30:47.222945 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.222927 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.222959 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.222969 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.222978 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.222988 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.222998 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.223006 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.223014 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 09:30:47.223035 ip-10-0-129-154 
kubenswrapper[2575]: I0423 09:30:47.223024 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 09:30:47.223035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.223034 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 09:30:47.223314 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.223056 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 09:30:47.223314 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.223069 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 09:30:47.225027 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.225013 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 09:30:47.225068 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.225030 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 09:30:47.228540 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.228392 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:47.229259 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.229246 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 09:30:47.229323 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.229285 2575 server.go:1295] "Started kubelet" Apr 23 09:30:47.229428 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.229378 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 09:30:47.229488 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.229421 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 09:30:47.229520 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.229498 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:47.229520 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:30:47.229515 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 09:30:47.230073 ip-10-0-129-154 systemd[1]: Started Kubernetes Kubelet. Apr 23 09:30:47.231087 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.230913 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 09:30:47.231689 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.231670 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 23 09:30:47.232314 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.232298 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-154.ec2.internal" not found Apr 23 09:30:47.238634 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.238605 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 09:30:47.239426 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.239408 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 09:30:47.239998 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.239980 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 09:30:47.240906 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.240890 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 09:30:47.240986 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.240913 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 09:30:47.240986 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.240893 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 09:30:47.241067 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.241026 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 23 09:30:47.241067 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.241034 2575 
reconciler.go:26] "Reconciler: start to sync state" Apr 23 09:30:47.241164 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.241149 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-154.ec2.internal\" not found" Apr 23 09:30:47.241619 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.241604 2575 factory.go:55] Registering systemd factory Apr 23 09:30:47.241705 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.241648 2575 factory.go:223] Registration of the systemd container factory successfully Apr 23 09:30:47.242137 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.242114 2575 factory.go:153] Registering CRI-O factory Apr 23 09:30:47.242137 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.242137 2575 factory.go:223] Registration of the crio container factory successfully Apr 23 09:30:47.242305 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.242242 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 09:30:47.242305 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.242269 2575 factory.go:103] Registering Raw factory Apr 23 09:30:47.242305 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.242286 2575 manager.go:1196] Started watching for new ooms in manager Apr 23 09:30:47.242440 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.242362 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:47.243325 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.243304 2575 manager.go:319] Starting recovery of all containers Apr 23 09:30:47.245336 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.245308 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-154.ec2.internal\" 
not found" node="ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.247158 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.247134 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-154.ec2.internal" not found Apr 23 09:30:47.254523 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.254508 2575 manager.go:324] Recovery completed Apr 23 09:30:47.258753 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.258740 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:47.260817 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.260801 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:47.260895 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.260830 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:47.260895 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.260841 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:47.261371 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.261360 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 09:30:47.261371 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.261370 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 09:30:47.261457 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.261406 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 09:30:47.265132 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.265116 2575 policy_none.go:49] "None policy: Start" Apr 23 09:30:47.265132 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.265135 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 09:30:47.265248 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.265146 2575 state_mem.go:35] 
"Initializing new in-memory state store" Apr 23 09:30:47.314901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.314880 2575 manager.go:341] "Starting Device Plugin manager" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.315002 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315017 2575 server.go:85] "Starting device plugin registration server" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315333 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315346 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315451 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315540 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315549 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.315747 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-154.ec2.internal" not found Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.316084 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 23 09:30:47.332321 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.316123 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-154.ec2.internal\" not found" Apr 23 09:30:47.350768 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.350720 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 09:30:47.352015 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.351993 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 09:30:47.352110 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.352036 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 09:30:47.352110 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.352062 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 09:30:47.352110 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.352071 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 09:30:47.352237 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:47.352112 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 09:30:47.355230 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.355207 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:47.416281 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.416198 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:47.420810 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.420786 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:47.420921 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.420818 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:47.420921 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.420831 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:47.420921 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.420855 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.431384 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.431355 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.452679 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.452642 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal"] Apr 23 09:30:47.457186 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.457145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.457310 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.457149 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.483027 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.483000 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.487836 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.487817 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.494362 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.494329 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 09:30:47.500726 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.500704 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 09:30:47.543041 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.543012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c6d06d93aa42ede6e1ca1a00d0b49d7a-config\") pod \"kube-apiserver-proxy-ip-10-0-129-154.ec2.internal\" (UID: \"c6d06d93aa42ede6e1ca1a00d0b49d7a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.543124 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.543047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8509a7e7d73752d8ab50ea562ad2fb3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal\" (UID: \"8509a7e7d73752d8ab50ea562ad2fb3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" 
Apr 23 09:30:47.543124 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.543076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8509a7e7d73752d8ab50ea562ad2fb3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal\" (UID: \"8509a7e7d73752d8ab50ea562ad2fb3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.644255 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.644217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c6d06d93aa42ede6e1ca1a00d0b49d7a-config\") pod \"kube-apiserver-proxy-ip-10-0-129-154.ec2.internal\" (UID: \"c6d06d93aa42ede6e1ca1a00d0b49d7a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.644255 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.644254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8509a7e7d73752d8ab50ea562ad2fb3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal\" (UID: \"8509a7e7d73752d8ab50ea562ad2fb3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.644402 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.644282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8509a7e7d73752d8ab50ea562ad2fb3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal\" (UID: \"8509a7e7d73752d8ab50ea562ad2fb3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.644402 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.644347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/host-path/c6d06d93aa42ede6e1ca1a00d0b49d7a-config\") pod \"kube-apiserver-proxy-ip-10-0-129-154.ec2.internal\" (UID: \"c6d06d93aa42ede6e1ca1a00d0b49d7a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.644402 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.644350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8509a7e7d73752d8ab50ea562ad2fb3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal\" (UID: \"8509a7e7d73752d8ab50ea562ad2fb3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.644495 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.644405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8509a7e7d73752d8ab50ea562ad2fb3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal\" (UID: \"8509a7e7d73752d8ab50ea562ad2fb3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.797997 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.797908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" Apr 23 09:30:47.802622 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:47.802602 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" Apr 23 09:30:48.136086 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.135991 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 09:30:48.136777 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.136231 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:48.136777 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.136231 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:48.136777 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.136236 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:48.215198 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.215150 2575 apiserver.go:52] "Watching apiserver" Apr 23 09:30:48.222930 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.222886 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 09:25:47 +0000 UTC" deadline="2027-10-14 08:56:04.48564143 +0000 UTC" Apr 23 09:30:48.222930 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.222929 2575 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12935h25m16.262718593s" Apr 23 09:30:48.223121 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.223103 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 09:30:48.224260 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.224238 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qbs24","openshift-multus/network-metrics-daemon-nfwtj","openshift-network-diagnostics/network-check-target-t5mzg","kube-system/konnectivity-agent-p669t","kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal","openshift-multus/multus-additional-cni-plugins-mvdjb","openshift-multus/multus-rbs48","openshift-network-operator/iptables-alerter-96h56","openshift-ovn-kubernetes/ovnkube-node-x7lqz","openshift-cluster-node-tuning-operator/tuned-xkbkt"] Apr 23 09:30:48.227286 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.227269 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qbs24" Apr 23 09:30:48.229964 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.229946 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 09:30:48.230045 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.229954 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-57kc9\"" Apr 23 09:30:48.230153 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.230141 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 09:30:48.230228 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.230212 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 09:30:48.231445 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.231428 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:48.231538 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.231494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:30:48.231538 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.231504 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:30:48.231633 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.231544 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:30:48.234256 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.234240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p669t" Apr 23 09:30:48.236474 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.236452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.236601 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.236583 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 09:30:48.236692 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.236583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-g65s2\"" Apr 23 09:30:48.236901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.236885 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 09:30:48.238649 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.238629 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.238753 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.238737 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 09:30:48.238807 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.238757 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 09:30:48.238873 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.238718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kwc2k\""
Apr 23 09:30:48.239018 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.239001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 09:30:48.240453 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.240432 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 09:30:48.240776 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.240710 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.241109 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.241090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 09:30:48.241261 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.241222 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fcwn7\""
Apr 23 09:30:48.241473 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.241452 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 09:30:48.242049 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.242029 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 09:30:48.242127 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.242107 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 09:30:48.242171 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.242109 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 09:30:48.242872 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.242855 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6tbpg\""
Apr 23 09:30:48.242948 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.242870 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 09:30:48.243660 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.243643 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-96h56"
Apr 23 09:30:48.245808 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.245787 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 09:30:48.245931 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.245913 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 09:30:48.245977 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.245917 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fqbwj\""
Apr 23 09:30:48.246061 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246032 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 09:30:48.246108 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246068 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.246197 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-sys-fs\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.246259 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-socket-dir-parent\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.246259 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52kj\" (UniqueName: \"kubernetes.io/projected/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-kube-api-access-l52kj\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.246354 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bea3c26f-67bd-4418-8a5c-830cf2936a15-host-slash\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56"
Apr 23 09:30:48.246806 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:48.246901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4922b0b7-cbb6-414b-9c1f-b71799a538cf-konnectivity-ca\") pod \"konnectivity-agent-p669t\" (UID: \"4922b0b7-cbb6-414b-9c1f-b71799a538cf\") " pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:30:48.246954 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-etc-selinux\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.246954 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-cni-multus\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247046 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.246982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") "
pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.247095 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f30d2db-f0de-4938-9291-99d089bb41d8-serviceca\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24"
Apr 23 09:30:48.247143 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-system-cni-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247298 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-hostroot\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247358 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-daemon-config\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247413 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod
\"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:48.247460 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.247460 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bea3c26f-67bd-4418-8a5c-830cf2936a15-iptables-alerter-script\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56"
Apr 23 09:30:48.247550 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhr4\" (UniqueName: \"kubernetes.io/projected/de2aeb89-80c4-49f3-bb15-19cbda495e58-kube-api-access-bhhr4\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.247550 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-cni-bin\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247637 ip-10-0-129-154 kubenswrapper[2575]: I0423
09:30:48.247562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-os-release\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.247637 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.247637 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f30d2db-f0de-4938-9291-99d089bb41d8-host\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24"
Apr 23 09:30:48.247761 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-cnibin\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247761 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-etc-kubernetes\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") "
pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247761 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.247891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-socket-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.247891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-device-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.247891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-cni-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-kubelet\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.247891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-multus-certs\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248111 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54zrz\" (UniqueName: \"kubernetes.io/projected/bea3c26f-67bd-4418-8a5c-830cf2936a15-kube-api-access-54zrz\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56"
Apr 23 09:30:48.248111 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84t5x\" (UniqueName: \"kubernetes.io/projected/9675e92f-c255-4d0a-a137-2fb828720d4d-kube-api-access-84t5x\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:48.248111 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.247975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4922b0b7-cbb6-414b-9c1f-b71799a538cf-agent-certs\") pod \"konnectivity-agent-p669t\" (UID: \"4922b0b7-cbb6-414b-9c1f-b71799a538cf\") " pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:30:48.248111 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248005
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-registration-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.248111 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-os-release\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68rdk\" (UniqueName: \"kubernetes.io/projected/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-kube-api-access-68rdk\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-system-cni-dir\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbqjp\" (UniqueName: \"kubernetes.io/projected/7f30d2db-f0de-4938-9291-99d089bb41d8-kube-api-access-dbqjp\") pod \"node-ca-qbs24\" (UID:
\"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-cni-binary-copy\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-k8s-cni-cncf-io\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-netns\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248695 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName:
\"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-conf-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.248695 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cnibin\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.248695 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.248419 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.249761 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.249732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 09:30:48.249761 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.249750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 09:30:48.249907 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.249853 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dpzbh\""
Apr 23 09:30:48.250217 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.250197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 09:30:48.250320 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.250226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 09:30:48.250723 ip-10-0-129-154 kubenswrapper[2575]: I0423
09:30:48.250698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 09:30:48.250874 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.250858 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 09:30:48.250938 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.250918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 09:30:48.251465 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.251449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qxzzp\""
Apr 23 09:30:48.251524 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.251483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 09:30:48.254818 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.254801 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 09:30:48.278918 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.278887 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7f4dd"
Apr 23 09:30:48.283969 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.283953 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7f4dd"
Apr 23 09:30:48.342424 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.342402 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 09:30:48.348759 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348740 2575
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovn-node-metrics-cert\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.348840 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84t5x\" (UniqueName: \"kubernetes.io/projected/9675e92f-c255-4d0a-a137-2fb828720d4d-kube-api-access-84t5x\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:48.348876 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-system-cni-dir\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.348908 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysconfig\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.348947 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysctl-d\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") "
pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.348992 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.348992 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.348985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-system-cni-dir\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.349079 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-log-socket\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.349079 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.349171 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName:
\"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-k8s-cni-cncf-io\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.349171 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-netns\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.349171 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-conf-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-k8s-cni-cncf-io\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName:
\"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cnibin\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-netns\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-conf-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovnkube-script-lib\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cnibin\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.349322 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mwb6g\" (UniqueName: \"kubernetes.io/projected/c2afc719-c33e-48f6-bacc-09ccb439d0fb-kube-api-access-mwb6g\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-sys-fs\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bea3c26f-67bd-4418-8a5c-830cf2936a15-host-slash\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-node-log\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-sys-fs\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349463 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bea3c26f-67bd-4418-8a5c-830cf2936a15-host-slash\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4922b0b7-cbb6-414b-9c1f-b71799a538cf-konnectivity-ca\") pod \"konnectivity-agent-p669t\" (UID: \"4922b0b7-cbb6-414b-9c1f-b71799a538cf\") " pod="kube-system/konnectivity-agent-p669t" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-etc-selinux\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-cni-multus\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.349625 ip-10-0-129-154 
kubenswrapper[2575]: I0423 09:30:48.349607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-slash\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.349625 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.349611 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-system-cni-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bea3c26f-67bd-4418-8a5c-830cf2936a15-iptables-alerter-script\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.349710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:30:48.849662057 +0000 UTC m=+2.113049218 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-run\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-etc-selinux\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-systemd-units\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-cni-bin\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:30:48.349821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-cni-multus\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-os-release\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-system-cni-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f30d2db-f0de-4938-9291-99d089bb41d8-host\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-etc-kubernetes\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349926 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f30d2db-f0de-4938-9291-99d089bb41d8-host\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-var-lib-kubelet\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.349967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-host\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350000 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-systemd\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.350120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-var-lib-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:30:48.350046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-socket-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4922b0b7-cbb6-414b-9c1f-b71799a538cf-konnectivity-ca\") pod \"konnectivity-agent-p669t\" (UID: \"4922b0b7-cbb6-414b-9c1f-b71799a538cf\") " pod="kube-system/konnectivity-agent-p669t" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-device-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-cni-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-kubelet\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350879 ip-10-0-129-154 
kubenswrapper[2575]: I0423 09:30:48.350287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-cni-bin\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54zrz\" (UniqueName: \"kubernetes.io/projected/bea3c26f-67bd-4418-8a5c-830cf2936a15-kube-api-access-54zrz\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-modprobe-d\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-etc-kubernetes\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bea3c26f-67bd-4418-8a5c-830cf2936a15-iptables-alerter-script\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-os-release\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-cni-dir\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-cni-bin\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-lib-modules\") pod \"tuned-xkbkt\" (UID: 
\"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-var-lib-kubelet\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.350879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4922b0b7-cbb6-414b-9c1f-b71799a538cf-agent-certs\") pod \"konnectivity-agent-p669t\" (UID: \"4922b0b7-cbb6-414b-9c1f-b71799a538cf\") " pod="kube-system/konnectivity-agent-p669t" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-registration-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-device-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-registration-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-os-release\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68rdk\" (UniqueName: \"kubernetes.io/projected/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-kube-api-access-68rdk\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de2aeb89-80c4-49f3-bb15-19cbda495e58-socket-dir\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-run-netns\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.350989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-os-release\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-env-overrides\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbqjp\" (UniqueName: \"kubernetes.io/projected/7f30d2db-f0de-4938-9291-99d089bb41d8-kube-api-access-dbqjp\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-cni-binary-copy\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351099 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l52kj\" (UniqueName: \"kubernetes.io/projected/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-kube-api-access-l52kj\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-ovn\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-socket-dir-parent\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.351673 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351339 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-sys\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xgqb\" (UniqueName: \"kubernetes.io/projected/4919ce05-148e-4367-8312-f7597a344990-kube-api-access-2xgqb\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f30d2db-f0de-4938-9291-99d089bb41d8-serviceca\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24" Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-hostroot\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351519 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-daemon-config\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4919ce05-148e-4367-8312-f7597a344990-etc-tuned\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4919ce05-148e-4367-8312-f7597a344990-tmp\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhr4\" (UniqueName: \"kubernetes.io/projected/de2aeb89-80c4-49f3-bb15-19cbda495e58-kube-api-access-bhhr4\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-kubernetes\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysctl-conf\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.352494 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-cni-binary-copy\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.351928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-kubelet\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovnkube-config\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-cnibin\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-cni-netd\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-hostroot\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f30d2db-f0de-4938-9291-99d089bb41d8-serviceca\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-cnibin\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-systemd\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-multus-certs\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.353469 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-etc-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.353469 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-socket-dir-parent\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.353469 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.352944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.353716 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.353675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-host-run-multus-certs\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.355066 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.354698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-multus-daemon-config\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.355066 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.354761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.355066 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.354751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.356878 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.356857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4922b0b7-cbb6-414b-9c1f-b71799a538cf-agent-certs\") pod \"konnectivity-agent-p669t\" (UID: \"4922b0b7-cbb6-414b-9c1f-b71799a538cf\") " pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:30:48.358686 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.358625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84t5x\" (UniqueName: \"kubernetes.io/projected/9675e92f-c255-4d0a-a137-2fb828720d4d-kube-api-access-84t5x\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:48.359430 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.359407 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 09:30:48.359530 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.359436 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 09:30:48.359530 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.359450 2575 projected.go:194] Error preparing data for projected volume kube-api-access-cwfkn for pod openshift-network-diagnostics/network-check-target-t5mzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:48.359651 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.359551 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn podName:83e2352e-7188-4529-a79a-11d59e36b30b nodeName:}" failed. No retries permitted until 2026-04-23 09:30:48.85953183 +0000 UTC m=+2.122918958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cwfkn" (UniqueName: "kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn") pod "network-check-target-t5mzg" (UID: "83e2352e-7188-4529-a79a-11d59e36b30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:48.360310 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.360286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68rdk\" (UniqueName: \"kubernetes.io/projected/7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc-kube-api-access-68rdk\") pod \"multus-rbs48\" (UID: \"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc\") " pod="openshift-multus/multus-rbs48"
Apr 23 09:30:48.360660 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.360636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54zrz\" (UniqueName: \"kubernetes.io/projected/bea3c26f-67bd-4418-8a5c-830cf2936a15-kube-api-access-54zrz\") pod \"iptables-alerter-96h56\" (UID: \"bea3c26f-67bd-4418-8a5c-830cf2936a15\") " pod="openshift-network-operator/iptables-alerter-96h56"
Apr 23 09:30:48.361016 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.360998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbqjp\" (UniqueName: \"kubernetes.io/projected/7f30d2db-f0de-4938-9291-99d089bb41d8-kube-api-access-dbqjp\") pod \"node-ca-qbs24\" (UID: \"7f30d2db-f0de-4938-9291-99d089bb41d8\") " pod="openshift-image-registry/node-ca-qbs24"
Apr 23 09:30:48.361215 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.361167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52kj\" (UniqueName: \"kubernetes.io/projected/d3bd2bf5-e581-45d4-b978-acec1bd86ea3-kube-api-access-l52kj\") pod \"multus-additional-cni-plugins-mvdjb\" (UID: \"d3bd2bf5-e581-45d4-b978-acec1bd86ea3\") " pod="openshift-multus/multus-additional-cni-plugins-mvdjb"
Apr 23 09:30:48.361860 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.361841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhr4\" (UniqueName: \"kubernetes.io/projected/de2aeb89-80c4-49f3-bb15-19cbda495e58-kube-api-access-bhhr4\") pod \"aws-ebs-csi-driver-node-qh6hr\" (UID: \"de2aeb89-80c4-49f3-bb15-19cbda495e58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr"
Apr 23 09:30:48.433661 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.433607 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8509a7e7d73752d8ab50ea562ad2fb3f.slice/crio-c29909972327b34ddeafde3ef5e0dbb73f95e0a777b6fc5269447ead2c5fc152 WatchSource:0}: Error finding container c29909972327b34ddeafde3ef5e0dbb73f95e0a777b6fc5269447ead2c5fc152: Status 404 returned error can't find the container with id c29909972327b34ddeafde3ef5e0dbb73f95e0a777b6fc5269447ead2c5fc152
Apr 23 09:30:48.434143 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.434121 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d06d93aa42ede6e1ca1a00d0b49d7a.slice/crio-d763eaadaf678ea1aabd3f02f04b5b564bf5e3adeb8e901dae82b26193a9d13a WatchSource:0}: Error finding container d763eaadaf678ea1aabd3f02f04b5b564bf5e3adeb8e901dae82b26193a9d13a: Status 404 returned error can't find the container with id d763eaadaf678ea1aabd3f02f04b5b564bf5e3adeb8e901dae82b26193a9d13a
Apr 23 09:30:48.439762 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.439740 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 09:30:48.453590 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-run\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453590 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-systemd-units\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-var-lib-kubelet\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-host\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-systemd\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-var-lib-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-run\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-systemd-units\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-host\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-systemd\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-var-lib-kubelet\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-modprobe-d\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.453809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-cni-bin\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-lib-modules\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-run-netns\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-var-lib-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-modprobe-d\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-cni-bin\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-env-overrides\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-run-netns\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-lib-modules\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-ovn\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.453996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-sys\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xgqb\" (UniqueName: \"kubernetes.io/projected/4919ce05-148e-4367-8312-f7597a344990-kube-api-access-2xgqb\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-ovn\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4919ce05-148e-4367-8312-f7597a344990-etc-tuned\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-sys\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.454366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4919ce05-148e-4367-8312-f7597a344990-tmp\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-kubernetes\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysctl-conf\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-kubelet\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-kubelet\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovnkube-config\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-env-overrides\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-cni-netd\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-systemd\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-etc-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysctl-conf\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-systemd\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-etc-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-cni-netd\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-kubernetes\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovn-node-metrics-cert\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455100 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysconfig\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysctl-d\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-log-socket\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysconfig\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovnkube-script-lib\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-run-openvswitch\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwb6g\" (UniqueName: \"kubernetes.io/projected/c2afc719-c33e-48f6-bacc-09ccb439d0fb-kube-api-access-mwb6g\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-node-log\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-slash\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovnkube-config\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-host-slash\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-log-socket\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454917 2575
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2afc719-c33e-48f6-bacc-09ccb439d0fb-node-log\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.454974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4919ce05-148e-4367-8312-f7597a344990-etc-sysctl-d\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.455707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.455362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovnkube-script-lib\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.456419 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.456401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4919ce05-148e-4367-8312-f7597a344990-etc-tuned\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.456686 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.456666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4919ce05-148e-4367-8312-f7597a344990-tmp\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.456816 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.456799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2afc719-c33e-48f6-bacc-09ccb439d0fb-ovn-node-metrics-cert\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.461984 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.461958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwb6g\" (UniqueName: \"kubernetes.io/projected/c2afc719-c33e-48f6-bacc-09ccb439d0fb-kube-api-access-mwb6g\") pod \"ovnkube-node-x7lqz\" (UID: \"c2afc719-c33e-48f6-bacc-09ccb439d0fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.462069 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.462047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xgqb\" (UniqueName: \"kubernetes.io/projected/4919ce05-148e-4367-8312-f7597a344990-kube-api-access-2xgqb\") pod \"tuned-xkbkt\" (UID: \"4919ce05-148e-4367-8312-f7597a344990\") " pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.552054 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.552022 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qbs24" Apr 23 09:30:48.558053 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.558025 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f30d2db_f0de_4938_9291_99d089bb41d8.slice/crio-1f6851ccc0065b1eebebcb94e4a836ad240b689bd66c2cfa8477cd4f590adb0d WatchSource:0}: Error finding container 1f6851ccc0065b1eebebcb94e4a836ad240b689bd66c2cfa8477cd4f590adb0d: Status 404 returned error can't find the container with id 1f6851ccc0065b1eebebcb94e4a836ad240b689bd66c2cfa8477cd4f590adb0d Apr 23 09:30:48.563746 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.563726 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-p669t" Apr 23 09:30:48.569751 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.569728 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4922b0b7_cbb6_414b_9c1f_b71799a538cf.slice/crio-5aec4343baff764832f92cc313721a0be26136ae862bd48b243150885d3cb197 WatchSource:0}: Error finding container 5aec4343baff764832f92cc313721a0be26136ae862bd48b243150885d3cb197: Status 404 returned error can't find the container with id 5aec4343baff764832f92cc313721a0be26136ae862bd48b243150885d3cb197 Apr 23 09:30:48.591712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.591684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" Apr 23 09:30:48.597664 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.597626 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2aeb89_80c4_49f3_bb15_19cbda495e58.slice/crio-602ef3f4d615e73455b08168468062e37d3723c19f10b2e60000fa092379fb0f WatchSource:0}: Error finding container 602ef3f4d615e73455b08168468062e37d3723c19f10b2e60000fa092379fb0f: Status 404 returned error can't find the container with id 602ef3f4d615e73455b08168468062e37d3723c19f10b2e60000fa092379fb0f Apr 23 09:30:48.606967 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.606942 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" Apr 23 09:30:48.613572 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.613548 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bd2bf5_e581_45d4_b978_acec1bd86ea3.slice/crio-9b204cf2a1371bc6c8c0b2b43b8e28d03ab5d4364f4508b36ea3b1f9af0051c3 WatchSource:0}: Error finding container 9b204cf2a1371bc6c8c0b2b43b8e28d03ab5d4364f4508b36ea3b1f9af0051c3: Status 404 returned error can't find the container with id 9b204cf2a1371bc6c8c0b2b43b8e28d03ab5d4364f4508b36ea3b1f9af0051c3 Apr 23 09:30:48.617458 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.617442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rbs48" Apr 23 09:30:48.622972 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.622948 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b5dd615_d19f_43a8_9c21_49d3b2cb8bcc.slice/crio-90e6d47a2f8fe0e8edeff92b93b165fabce8150ab887c6b08a5de7d173498d58 WatchSource:0}: Error finding container 90e6d47a2f8fe0e8edeff92b93b165fabce8150ab887c6b08a5de7d173498d58: Status 404 returned error can't find the container with id 90e6d47a2f8fe0e8edeff92b93b165fabce8150ab887c6b08a5de7d173498d58 Apr 23 09:30:48.633071 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.633050 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-96h56" Apr 23 09:30:48.639102 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.639070 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea3c26f_67bd_4418_8a5c_830cf2936a15.slice/crio-6bf981bcb23ef60569ef25ac7d04a810e5a0abcce00f4e12ce3a88abef50d4f8 WatchSource:0}: Error finding container 6bf981bcb23ef60569ef25ac7d04a810e5a0abcce00f4e12ce3a88abef50d4f8: Status 404 returned error can't find the container with id 6bf981bcb23ef60569ef25ac7d04a810e5a0abcce00f4e12ce3a88abef50d4f8 Apr 23 09:30:48.645108 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.645091 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:30:48.650972 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.650953 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" Apr 23 09:30:48.651301 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.651281 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2afc719_c33e_48f6_bacc_09ccb439d0fb.slice/crio-d641ff9ad3fc41bb42bd8b125bad7f90654a8180cebeee325c4a3bda8ea1d99b WatchSource:0}: Error finding container d641ff9ad3fc41bb42bd8b125bad7f90654a8180cebeee325c4a3bda8ea1d99b: Status 404 returned error can't find the container with id d641ff9ad3fc41bb42bd8b125bad7f90654a8180cebeee325c4a3bda8ea1d99b Apr 23 09:30:48.656810 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:30:48.656779 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4919ce05_148e_4367_8312_f7597a344990.slice/crio-a262488aba4c3153373d22cb90d9b5e3d55e7a575fa3b462939373f35b2bf8e9 WatchSource:0}: Error finding container 
a262488aba4c3153373d22cb90d9b5e3d55e7a575fa3b462939373f35b2bf8e9: Status 404 returned error can't find the container with id a262488aba4c3153373d22cb90d9b5e3d55e7a575fa3b462939373f35b2bf8e9 Apr 23 09:30:48.859139 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.858838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:48.859139 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.859020 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:48.859338 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.859189 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:30:49.859153213 +0000 UTC m=+3.122540342 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:48.960286 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:48.960246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:30:48.960485 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.960456 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:48.960485 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.960475 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:48.960587 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.960488 2575 projected.go:194] Error preparing data for projected volume kube-api-access-cwfkn for pod openshift-network-diagnostics/network-check-target-t5mzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:48.960587 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:48.960555 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn podName:83e2352e-7188-4529-a79a-11d59e36b30b nodeName:}" failed. 
No retries permitted until 2026-04-23 09:30:49.960537049 +0000 UTC m=+3.223924178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cwfkn" (UniqueName: "kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn") pod "network-check-target-t5mzg" (UID: "83e2352e-7188-4529-a79a-11d59e36b30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:49.176446 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.176368 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:49.284834 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.284747 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 09:25:48 +0000 UTC" deadline="2027-12-21 11:58:54.136352022 +0000 UTC" Apr 23 09:30:49.284834 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.284777 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14570h28m4.851579377s" Apr 23 09:30:49.353875 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.353343 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:49.353875 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.353499 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:30:49.360150 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.360051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" event={"ID":"4919ce05-148e-4367-8312-f7597a344990","Type":"ContainerStarted","Data":"a262488aba4c3153373d22cb90d9b5e3d55e7a575fa3b462939373f35b2bf8e9"} Apr 23 09:30:49.362950 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.362913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"d641ff9ad3fc41bb42bd8b125bad7f90654a8180cebeee325c4a3bda8ea1d99b"} Apr 23 09:30:49.364832 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.364798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-96h56" event={"ID":"bea3c26f-67bd-4418-8a5c-830cf2936a15","Type":"ContainerStarted","Data":"6bf981bcb23ef60569ef25ac7d04a810e5a0abcce00f4e12ce3a88abef50d4f8"} Apr 23 09:30:49.373757 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.373648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerStarted","Data":"9b204cf2a1371bc6c8c0b2b43b8e28d03ab5d4364f4508b36ea3b1f9af0051c3"} Apr 23 09:30:49.378068 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.377984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" event={"ID":"de2aeb89-80c4-49f3-bb15-19cbda495e58","Type":"ContainerStarted","Data":"602ef3f4d615e73455b08168468062e37d3723c19f10b2e60000fa092379fb0f"} Apr 23 09:30:49.383065 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.382993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-p669t" event={"ID":"4922b0b7-cbb6-414b-9c1f-b71799a538cf","Type":"ContainerStarted","Data":"5aec4343baff764832f92cc313721a0be26136ae862bd48b243150885d3cb197"} Apr 23 09:30:49.386460 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.386394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qbs24" event={"ID":"7f30d2db-f0de-4938-9291-99d089bb41d8","Type":"ContainerStarted","Data":"1f6851ccc0065b1eebebcb94e4a836ad240b689bd66c2cfa8477cd4f590adb0d"} Apr 23 09:30:49.390373 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.390345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rbs48" event={"ID":"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc","Type":"ContainerStarted","Data":"90e6d47a2f8fe0e8edeff92b93b165fabce8150ab887c6b08a5de7d173498d58"} Apr 23 09:30:49.397199 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.397097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" event={"ID":"c6d06d93aa42ede6e1ca1a00d0b49d7a","Type":"ContainerStarted","Data":"d763eaadaf678ea1aabd3f02f04b5b564bf5e3adeb8e901dae82b26193a9d13a"} Apr 23 09:30:49.404423 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.404379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" event={"ID":"8509a7e7d73752d8ab50ea562ad2fb3f","Type":"ContainerStarted","Data":"c29909972327b34ddeafde3ef5e0dbb73f95e0a777b6fc5269447ead2c5fc152"} Apr 23 09:30:49.480762 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.480687 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:49.644220 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.644170 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 
09:30:49.867445 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.867351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:49.867615 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.867551 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:49.867672 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.867615 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:30:51.86759722 +0000 UTC m=+5.130984363 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:49.968533 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:49.968485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:30:49.968880 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.968732 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:49.968880 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.968756 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:49.968880 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.968769 2575 projected.go:194] Error preparing data for projected volume kube-api-access-cwfkn for pod openshift-network-diagnostics/network-check-target-t5mzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:49.968880 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:49.968833 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn podName:83e2352e-7188-4529-a79a-11d59e36b30b nodeName:}" failed. 
No retries permitted until 2026-04-23 09:30:51.968814561 +0000 UTC m=+5.232201706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cwfkn" (UniqueName: "kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn") pod "network-check-target-t5mzg" (UID: "83e2352e-7188-4529-a79a-11d59e36b30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:50.286074 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:50.285960 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 09:25:48 +0000 UTC" deadline="2027-09-22 05:12:01.777884486 +0000 UTC" Apr 23 09:30:50.286074 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:50.286001 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12403h41m11.491887387s" Apr 23 09:30:50.352309 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:50.352279 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:30:50.352500 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:50.352415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:30:51.354909 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:51.354065 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:51.354909 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.354235 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:30:51.882707 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:51.882670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:30:51.882886 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.882869 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:51.882955 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.882946 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:30:55.882924948 +0000 UTC m=+9.146312081 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:51.984006 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:51.983970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:30:51.984227 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.984195 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:51.984227 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.984218 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:51.984349 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.984231 2575 projected.go:194] Error preparing data for projected volume kube-api-access-cwfkn for pod openshift-network-diagnostics/network-check-target-t5mzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:51.984349 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:51.984290 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn podName:83e2352e-7188-4529-a79a-11d59e36b30b nodeName:}" failed. 
No retries permitted until 2026-04-23 09:30:55.984271855 +0000 UTC m=+9.247658989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cwfkn" (UniqueName: "kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn") pod "network-check-target-t5mzg" (UID: "83e2352e-7188-4529-a79a-11d59e36b30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:52.353360 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.353261 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:52.353529 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:52.353412 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:30:52.954727 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.954486 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xvcqj"]
Apr 23 09:30:52.957377 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.957349 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:52.960823 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.960799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 09:30:52.962002 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.961978 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 09:30:52.962262 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.962245 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-c65vd\""
Apr 23 09:30:52.992045 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.991994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbce7cd3-954b-41fd-b6c9-9f47fee30477-tmp-dir\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:52.992259 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.992072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bbce7cd3-954b-41fd-b6c9-9f47fee30477-hosts-file\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:52.992259 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:52.992123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvs7j\" (UniqueName: \"kubernetes.io/projected/bbce7cd3-954b-41fd-b6c9-9f47fee30477-kube-api-access-mvs7j\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.092991 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.092949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbce7cd3-954b-41fd-b6c9-9f47fee30477-tmp-dir\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.093269 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.093025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bbce7cd3-954b-41fd-b6c9-9f47fee30477-hosts-file\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.093269 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.093075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvs7j\" (UniqueName: \"kubernetes.io/projected/bbce7cd3-954b-41fd-b6c9-9f47fee30477-kube-api-access-mvs7j\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.093269 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.093201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bbce7cd3-954b-41fd-b6c9-9f47fee30477-hosts-file\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.093473 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.093402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbce7cd3-954b-41fd-b6c9-9f47fee30477-tmp-dir\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.104327 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.104253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvs7j\" (UniqueName: \"kubernetes.io/projected/bbce7cd3-954b-41fd-b6c9-9f47fee30477-kube-api-access-mvs7j\") pod \"node-resolver-xvcqj\" (UID: \"bbce7cd3-954b-41fd-b6c9-9f47fee30477\") " pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.269741 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.269627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvcqj"
Apr 23 09:30:53.353040 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:53.353005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:53.353233 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:53.353150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:30:54.352988 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:54.352959 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:54.353479 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:54.353080 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:30:55.353451 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:55.353415 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:55.353911 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:55.353561 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:30:55.916280 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:55.915692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:55.916280 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:55.915853 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:55.916280 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:55.915917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:31:03.915896316 +0000 UTC m=+17.179283444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:30:56.016521 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:56.016458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:56.016694 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:56.016647 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 09:30:56.016694 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:56.016668 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 09:30:56.016694 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:56.016681 2575 projected.go:194] Error preparing data for projected volume kube-api-access-cwfkn for pod openshift-network-diagnostics/network-check-target-t5mzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:56.016871 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:56.016741 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn podName:83e2352e-7188-4529-a79a-11d59e36b30b nodeName:}" failed. No retries permitted until 2026-04-23 09:31:04.016721758 +0000 UTC m=+17.280108909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cwfkn" (UniqueName: "kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn") pod "network-check-target-t5mzg" (UID: "83e2352e-7188-4529-a79a-11d59e36b30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:30:56.352949 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:56.352864 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:56.353112 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:56.352999 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:30:57.353466 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:57.353433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:57.353922 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:57.353555 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:30:58.352658 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:58.352580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:30:58.352810 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:58.352703 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:30:59.352920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:30:59.352878 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:30:59.353389 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:30:59.353044 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:31:00.352250 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:00.352216 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:00.352471 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:00.352320 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:31:01.354982 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:01.354945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:01.355413 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:01.355068 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:31:02.352754 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:02.352718 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:02.352956 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:02.352827 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:31:03.353222 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:03.353185 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:03.353630 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:03.353336 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:31:03.965680 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:03.965638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:03.965868 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:03.965788 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:31:03.965938 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:03.965923 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:31:19.965850649 +0000 UTC m=+33.229237800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:31:04.066323 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:04.066278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:04.066511 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:04.066469 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 09:31:04.066511 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:04.066497 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 09:31:04.066511 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:04.066507 2575 projected.go:194] Error preparing data for projected volume kube-api-access-cwfkn for pod openshift-network-diagnostics/network-check-target-t5mzg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:31:04.066633 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:04.066580 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn podName:83e2352e-7188-4529-a79a-11d59e36b30b nodeName:}" failed. No retries permitted until 2026-04-23 09:31:20.066559619 +0000 UTC m=+33.329946838 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cwfkn" (UniqueName: "kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn") pod "network-check-target-t5mzg" (UID: "83e2352e-7188-4529-a79a-11d59e36b30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 09:31:04.352445 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:04.352362 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:04.352601 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:04.352492 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:31:05.352537 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:05.352502 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:05.352955 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:05.352641 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:31:06.352764 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:06.352741 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:06.353128 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:06.352870 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:31:06.436945 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:06.436905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvcqj" event={"ID":"bbce7cd3-954b-41fd-b6c9-9f47fee30477","Type":"ContainerStarted","Data":"950a4aa12ebc9193a633ac4c2d7820d55aebdd5f03bede14189caad401a2dac4"}
Apr 23 09:31:07.353543 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.353363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:07.354401 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:07.353602 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:31:07.439758 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.439650 2575 generic.go:358] "Generic (PLEG): container finished" podID="d3bd2bf5-e581-45d4-b978-acec1bd86ea3" containerID="18491f63d0d335b4cf5a653b3c0bc224c76b1dbef891bf30f1e6960d92936e10" exitCode=0
Apr 23 09:31:07.439758 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.439713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerDied","Data":"18491f63d0d335b4cf5a653b3c0bc224c76b1dbef891bf30f1e6960d92936e10"}
Apr 23 09:31:07.441032 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.441001 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" event={"ID":"de2aeb89-80c4-49f3-bb15-19cbda495e58","Type":"ContainerStarted","Data":"c0b557a8912d657bd5761b80c98820cf2405a668a55e5f45a4a9b0923d6cc25c"}
Apr 23 09:31:07.442193 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.442155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p669t" event={"ID":"4922b0b7-cbb6-414b-9c1f-b71799a538cf","Type":"ContainerStarted","Data":"a27c799b85626bb6ccf0fdf51563fa2e31bb1d0b9a597dcaaf93048cf2508e3c"}
Apr 23 09:31:07.443353 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.443331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qbs24" event={"ID":"7f30d2db-f0de-4938-9291-99d089bb41d8","Type":"ContainerStarted","Data":"172844a0ffb6b26181d6d5971e0a00aea5a63386bb865486317ac8148f37f6b6"}
Apr 23 09:31:07.444530 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.444499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rbs48" event={"ID":"7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc","Type":"ContainerStarted","Data":"d788c724c3dbebd9e84d47a911aa090afd9a1166658aa4d38d119d304eb3c913"}
Apr 23 09:31:07.445671 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.445650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" event={"ID":"c6d06d93aa42ede6e1ca1a00d0b49d7a","Type":"ContainerStarted","Data":"2d924e0c8c408cefa75fe61d4c73ff7dea9c19885734158f2fb74aeb3cd6a3a6"}
Apr 23 09:31:07.446901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.446879 2575 generic.go:358] "Generic (PLEG): container finished" podID="8509a7e7d73752d8ab50ea562ad2fb3f" containerID="17b225cab1502d2421afdb567d74389e27b6beb160093f127fe9b4e64be443aa" exitCode=0
Apr 23 09:31:07.447005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.446936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" event={"ID":"8509a7e7d73752d8ab50ea562ad2fb3f","Type":"ContainerDied","Data":"17b225cab1502d2421afdb567d74389e27b6beb160093f127fe9b4e64be443aa"}
Apr 23 09:31:07.448094 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.448067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvcqj" event={"ID":"bbce7cd3-954b-41fd-b6c9-9f47fee30477","Type":"ContainerStarted","Data":"5d27edca0354e787a8fcc3a114a8926ccd7923ca9da984402f83881c94b0b20e"}
Apr 23 09:31:07.449331 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.449300 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" event={"ID":"4919ce05-148e-4367-8312-f7597a344990","Type":"ContainerStarted","Data":"930d08549003dd47b39bbb93523c1ec1645055709e0031ef7761ad3203829755"}
Apr 23 09:31:07.451822 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.451801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"7fdff474ce602873aa56ccd51ed6ebc9942fc545633b4e368af5011b0e404fb2"}
Apr 23 09:31:07.451901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.451827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"c3aadd34d581a12ccc39b8c4f68ebe6c09f0a68dd4192aaa36f6e3ce6702c059"}
Apr 23 09:31:07.451901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.451835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"baa6f4e7d4aa967975f551897d0527a60d7a1692a17fe7908777f1ffafae3acf"}
Apr 23 09:31:07.451901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.451847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"89d6ec7cbe29ca3a0bac7468b418eae8f516d9ae1440dc2628ad289fb4c1f795"}
Apr 23 09:31:07.451901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.451858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"026e0480759f65c9882a0814293d66d3c5196f1171b0a51f28bdceb5029fc1a8"}
Apr 23 09:31:07.451901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.451866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"69c4b88ae7eadbac2a5a3da6979a1078e5daaf436a8081329f1fb91986b02240"}
Apr 23 09:31:07.491637 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.491582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p669t" podStartSLOduration=2.8233799040000003 podStartE2EDuration="20.491567499s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.571210659 +0000 UTC m=+1.834597788" lastFinishedPulling="2026-04-23 09:31:06.239398247 +0000 UTC m=+19.502785383" observedRunningTime="2026-04-23 09:31:07.491562059 +0000 UTC m=+20.754949208" watchObservedRunningTime="2026-04-23 09:31:07.491567499 +0000 UTC m=+20.754954648"
Apr 23 09:31:07.510049 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.509997 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rbs48" podStartSLOduration=2.809521051 podStartE2EDuration="20.509982043s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.624444205 +0000 UTC m=+1.887831334" lastFinishedPulling="2026-04-23 09:31:06.324905198 +0000 UTC m=+19.588292326" observedRunningTime="2026-04-23 09:31:07.509816556 +0000 UTC m=+20.773203718" watchObservedRunningTime="2026-04-23 09:31:07.509982043 +0000 UTC m=+20.773369203"
Apr 23 09:31:07.524878 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.524807 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xvcqj" podStartSLOduration=15.524789585 podStartE2EDuration="15.524789585s" podCreationTimestamp="2026-04-23 09:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:07.524479971 +0000 UTC m=+20.787867118" watchObservedRunningTime="2026-04-23 09:31:07.524789585 +0000 UTC m=+20.788176736"
Apr 23 09:31:07.543874 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.543831 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-154.ec2.internal" podStartSLOduration=20.543818621 podStartE2EDuration="20.543818621s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:07.543449952 +0000 UTC m=+20.806837101" watchObservedRunningTime="2026-04-23 09:31:07.543818621 +0000 UTC m=+20.807205771"
Apr 23 09:31:07.584399 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.584349 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xkbkt" podStartSLOduration=2.932965986 podStartE2EDuration="20.584334819s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.658824004 +0000 UTC m=+1.922211132" lastFinishedPulling="2026-04-23 09:31:06.310192821 +0000 UTC m=+19.573579965" observedRunningTime="2026-04-23 09:31:07.583763399 +0000 UTC m=+20.847150546" watchObservedRunningTime="2026-04-23 09:31:07.584334819 +0000 UTC m=+20.847722001"
Apr 23 09:31:07.603404 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:07.603348 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qbs24" podStartSLOduration=2.852177445 podStartE2EDuration="20.603331389s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.559516993 +0000 UTC m=+1.822904120" lastFinishedPulling="2026-04-23 09:31:06.310670922 +0000 UTC m=+19.574058064" observedRunningTime="2026-04-23 09:31:07.603051426 +0000 UTC m=+20.866438576" watchObservedRunningTime="2026-04-23 09:31:07.603331389 +0000 UTC m=+20.866718578"
Apr 23 09:31:08.138151 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.138122 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 09:31:08.255862 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.255826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:31:08.256516 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.256494 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:31:08.326566 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.326429 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T09:31:08.1381477Z","UUID":"379c607e-db35-4a59-972e-ac686a80ee31","Handler":null,"Name":"","Endpoint":""}
Apr 23 09:31:08.329333 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.329308 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 09:31:08.329333 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.329337 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 09:31:08.352726 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.352688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:08.352915 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:08.352813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b"
Apr 23 09:31:08.455431 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.455393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" event={"ID":"8509a7e7d73752d8ab50ea562ad2fb3f","Type":"ContainerStarted","Data":"d9343658ddbd1967368505ccdcaf44e2f1d037f5895c12852d4acbd8d24b678b"}
Apr 23 09:31:08.456852 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.456821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-96h56" event={"ID":"bea3c26f-67bd-4418-8a5c-830cf2936a15","Type":"ContainerStarted","Data":"ca51478eac0fb55c127ff815487248eb4e288e84190a7f1d09706a77155ab3a3"}
Apr 23 09:31:08.458589 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.458561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" event={"ID":"de2aeb89-80c4-49f3-bb15-19cbda495e58","Type":"ContainerStarted","Data":"d51b0cc347ebb279a1a0fd3dd4fc96cb02f3402c3a3eb33e338c2cb4a460f6de"}
Apr 23 09:31:08.459491 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.459470 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:31:08.459937 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.459915 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p669t"
Apr 23 09:31:08.470055 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.469980 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-154.ec2.internal" podStartSLOduration=21.469964626 podStartE2EDuration="21.469964626s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:08.469916053 +0000 UTC m=+21.733303203" watchObservedRunningTime="2026-04-23 09:31:08.469964626 +0000 UTC m=+21.733351776"
Apr 23 09:31:08.496919 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:08.496854 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-96h56" podStartSLOduration=3.898036111 podStartE2EDuration="21.496834106s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.640586213 +0000 UTC m=+1.903973340" lastFinishedPulling="2026-04-23 09:31:06.239384193 +0000 UTC m=+19.502771335" observedRunningTime="2026-04-23 09:31:08.496500793 +0000 UTC m=+21.759887945" watchObservedRunningTime="2026-04-23 09:31:08.496834106 +0000 UTC m=+21.760221257"
Apr 23 09:31:09.352528 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:09.352438 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:09.352727 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:09.352582 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d"
Apr 23 09:31:09.464027 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:09.463979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"d06d3dfe542f0c1b76e9020a92e361a2f31a9921abba1729f6e8c9be22674ec9"}
Apr 23 09:31:10.352843 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:10.352663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:10.353046 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:10.352937 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:31:10.467627 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:10.467578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" event={"ID":"de2aeb89-80c4-49f3-bb15-19cbda495e58","Type":"ContainerStarted","Data":"6b83e125bf43a5dbdc2b9c7d7677e07d0c204348d07109486c769a08e5c50bfe"} Apr 23 09:31:10.486129 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:10.486071 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qh6hr" podStartSLOduration=2.3961991879999998 podStartE2EDuration="23.486053622s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.59952927 +0000 UTC m=+1.862916399" lastFinishedPulling="2026-04-23 09:31:09.689383499 +0000 UTC m=+22.952770833" observedRunningTime="2026-04-23 09:31:10.485822964 +0000 UTC m=+23.749210116" watchObservedRunningTime="2026-04-23 09:31:10.486053622 +0000 UTC m=+23.749440774" Apr 23 09:31:11.352596 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:11.352551 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:11.352767 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:11.352687 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:31:12.353047 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:12.352883 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:31:12.353730 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:12.353113 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:31:12.473271 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:12.473235 2575 generic.go:358] "Generic (PLEG): container finished" podID="d3bd2bf5-e581-45d4-b978-acec1bd86ea3" containerID="67928bc083035c863cf697773e11bbd48760e0d7c5ef188ec0230c93aeb9aa42" exitCode=0 Apr 23 09:31:12.473461 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:12.473328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerDied","Data":"67928bc083035c863cf697773e11bbd48760e0d7c5ef188ec0230c93aeb9aa42"} Apr 23 09:31:12.476724 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:12.476701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" event={"ID":"c2afc719-c33e-48f6-bacc-09ccb439d0fb","Type":"ContainerStarted","Data":"1566e60bab3533c2ab3fb90b4dc9fd1ad19c046014eabec50d0bfa1bb0d7e33f"} Apr 23 09:31:12.477011 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:12.476992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:31:12.491991 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:12.491954 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:31:12.519259 ip-10-0-129-154 
kubenswrapper[2575]: I0423 09:31:12.519208 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" podStartSLOduration=7.682201837 podStartE2EDuration="25.519193102s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.653816918 +0000 UTC m=+1.917204045" lastFinishedPulling="2026-04-23 09:31:06.490808177 +0000 UTC m=+19.754195310" observedRunningTime="2026-04-23 09:31:12.517552589 +0000 UTC m=+25.780939738" watchObservedRunningTime="2026-04-23 09:31:12.519193102 +0000 UTC m=+25.782580248" Apr 23 09:31:13.353247 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.353218 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:13.353709 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:13.353337 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:31:13.480921 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.480885 2575 generic.go:358] "Generic (PLEG): container finished" podID="d3bd2bf5-e581-45d4-b978-acec1bd86ea3" containerID="09ce19ac295d87340d70270f638677c0be1266b07a38022c3747ee2f0afc598c" exitCode=0 Apr 23 09:31:13.481090 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.480984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerDied","Data":"09ce19ac295d87340d70270f638677c0be1266b07a38022c3747ee2f0afc598c"} Apr 23 09:31:13.481090 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.481052 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 09:31:13.481460 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.481436 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:31:13.497592 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.497557 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:31:13.617275 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.617238 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t5mzg"] Apr 23 09:31:13.617451 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.617396 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:31:13.617527 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:13.617505 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:31:13.620071 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.620044 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nfwtj"] Apr 23 09:31:13.620210 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:13.620149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:13.620257 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:13.620241 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:31:14.484301 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:14.484099 2575 generic.go:358] "Generic (PLEG): container finished" podID="d3bd2bf5-e581-45d4-b978-acec1bd86ea3" containerID="a861a7e20eee5402866d8ee7b036fe6be3474e64f383461d62a7eccc22227fa4" exitCode=0 Apr 23 09:31:14.484301 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:14.484200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerDied","Data":"a861a7e20eee5402866d8ee7b036fe6be3474e64f383461d62a7eccc22227fa4"} Apr 23 09:31:14.484693 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:14.484458 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 09:31:15.204573 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:15.204538 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" Apr 23 09:31:15.352948 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:15.352915 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:15.353103 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:15.352923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:31:15.353103 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:15.353063 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:31:15.353103 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:15.353088 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:31:16.500777 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:16.500714 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz" podUID="c2afc719-c33e-48f6-bacc-09ccb439d0fb" containerName="ovnkube-controller" probeResult="failure" output="" Apr 23 09:31:17.353537 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:17.353500 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:31:17.353693 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:17.353612 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t5mzg" podUID="83e2352e-7188-4529-a79a-11d59e36b30b" Apr 23 09:31:17.353693 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:17.353646 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:17.353795 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:17.353735 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nfwtj" podUID="9675e92f-c255-4d0a-a137-2fb828720d4d" Apr 23 09:31:19.053411 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.053379 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-154.ec2.internal" event="NodeReady" Apr 23 09:31:19.053976 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.053539 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 09:31:19.093467 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.093427 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w7jwq"] Apr 23 09:31:19.130249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.130205 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tlwj2"] Apr 23 09:31:19.130424 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.130375 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.133437 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.133412 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 09:31:19.133568 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.133448 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 09:31:19.133568 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.133488 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvkwp\"" Apr 23 09:31:19.143898 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.143866 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w7jwq"] Apr 23 09:31:19.144046 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.143904 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tlwj2"] Apr 23 09:31:19.144046 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.143947 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.147409 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.147348 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 09:31:19.147409 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.147373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 09:31:19.147602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.147503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mw7t9\"" Apr 23 09:31:19.147602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.147513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 09:31:19.277745 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.277705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxgd7\" (UniqueName: \"kubernetes.io/projected/208a4652-fd62-4c8b-b1b9-542601fb566c-kube-api-access-sxgd7\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.277936 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.277814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/208a4652-fd62-4c8b-b1b9-542601fb566c-tmp-dir\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.277936 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.277883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttt7\" (UniqueName: 
\"kubernetes.io/projected/198951b6-14d0-4d27-82e3-20e88b58ddc3-kube-api-access-2ttt7\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.277936 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.277908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/208a4652-fd62-4c8b-b1b9-542601fb566c-config-volume\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.278080 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.277952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.278080 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.277993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.353328 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.353285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:31:19.353946 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.353553 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:19.356797 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.356496 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 09:31:19.356797 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.356539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwhsm\"" Apr 23 09:31:19.356797 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.356570 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 09:31:19.356797 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.356600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gg4dj\"" Apr 23 09:31:19.356797 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.356545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 09:31:19.378309 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.378454 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxgd7\" (UniqueName: \"kubernetes.io/projected/208a4652-fd62-4c8b-b1b9-542601fb566c-kube-api-access-sxgd7\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.378454 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/208a4652-fd62-4c8b-b1b9-542601fb566c-tmp-dir\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.378562 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttt7\" (UniqueName: \"kubernetes.io/projected/198951b6-14d0-4d27-82e3-20e88b58ddc3-kube-api-access-2ttt7\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.378562 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/208a4652-fd62-4c8b-b1b9-542601fb566c-config-volume\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.378562 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.378712 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.378626 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:19.378712 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.378670 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:19.378795 ip-10-0-129-154 
kubenswrapper[2575]: E0423 09:31:19.378735 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls podName:208a4652-fd62-4c8b-b1b9-542601fb566c nodeName:}" failed. No retries permitted until 2026-04-23 09:31:19.878705747 +0000 UTC m=+33.142092883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls") pod "dns-default-w7jwq" (UID: "208a4652-fd62-4c8b-b1b9-542601fb566c") : secret "dns-default-metrics-tls" not found Apr 23 09:31:19.378795 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.378753 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert podName:198951b6-14d0-4d27-82e3-20e88b58ddc3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:19.878744336 +0000 UTC m=+33.142131465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert") pod "ingress-canary-tlwj2" (UID: "198951b6-14d0-4d27-82e3-20e88b58ddc3") : secret "canary-serving-cert" not found Apr 23 09:31:19.378867 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.378790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/208a4652-fd62-4c8b-b1b9-542601fb566c-tmp-dir\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.379064 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.379046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/208a4652-fd62-4c8b-b1b9-542601fb566c-config-volume\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 
09:31:19.390840 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.390805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxgd7\" (UniqueName: \"kubernetes.io/projected/208a4652-fd62-4c8b-b1b9-542601fb566c-kube-api-access-sxgd7\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.391021 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.390845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttt7\" (UniqueName: \"kubernetes.io/projected/198951b6-14d0-4d27-82e3-20e88b58ddc3-kube-api-access-2ttt7\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.884723 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.884683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:19.884921 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.884809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:19.884921 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.884863 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:19.884921 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.884916 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 
09:31:19.885071 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.884946 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert podName:198951b6-14d0-4d27-82e3-20e88b58ddc3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:20.884925156 +0000 UTC m=+34.148312298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert") pod "ingress-canary-tlwj2" (UID: "198951b6-14d0-4d27-82e3-20e88b58ddc3") : secret "canary-serving-cert" not found Apr 23 09:31:19.885071 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.884967 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls podName:208a4652-fd62-4c8b-b1b9-542601fb566c nodeName:}" failed. No retries permitted until 2026-04-23 09:31:20.884957768 +0000 UTC m=+34.148344898 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls") pod "dns-default-w7jwq" (UID: "208a4652-fd62-4c8b-b1b9-542601fb566c") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:19.985961 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:19.985920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj"
Apr 23 09:31:19.986128 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.986073 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 09:31:19.986193 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:19.986142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs podName:9675e92f-c255-4d0a-a137-2fb828720d4d nodeName:}" failed. No retries permitted until 2026-04-23 09:31:51.986125762 +0000 UTC m=+65.249512890 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs") pod "network-metrics-daemon-nfwtj" (UID: "9675e92f-c255-4d0a-a137-2fb828720d4d") : secret "metrics-daemon-secret" not found
Apr 23 09:31:20.086951 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:20.086910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:20.089446 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:20.089425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwfkn\" (UniqueName: \"kubernetes.io/projected/83e2352e-7188-4529-a79a-11d59e36b30b-kube-api-access-cwfkn\") pod \"network-check-target-t5mzg\" (UID: \"83e2352e-7188-4529-a79a-11d59e36b30b\") " pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:20.265290 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:20.265199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:20.496609 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:20.496383 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t5mzg"]
Apr 23 09:31:20.503327 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:20.503297 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e2352e_7188_4529_a79a_11d59e36b30b.slice/crio-83e066823b90d72c6472bc0f26318d2f7b78e725e605bf58ae21bb02891021fa WatchSource:0}: Error finding container 83e066823b90d72c6472bc0f26318d2f7b78e725e605bf58ae21bb02891021fa: Status 404 returned error can't find the container with id 83e066823b90d72c6472bc0f26318d2f7b78e725e605bf58ae21bb02891021fa
Apr 23 09:31:20.893606 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:20.893509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2"
Apr 23 09:31:20.893606 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:20.893584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq"
Apr 23 09:31:20.893865 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:20.893664 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:20.893865 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:20.893738 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert podName:198951b6-14d0-4d27-82e3-20e88b58ddc3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:22.89371907 +0000 UTC m=+36.157106197 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert") pod "ingress-canary-tlwj2" (UID: "198951b6-14d0-4d27-82e3-20e88b58ddc3") : secret "canary-serving-cert" not found
Apr 23 09:31:20.893865 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:20.893675 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:20.893865 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:20.893802 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls podName:208a4652-fd62-4c8b-b1b9-542601fb566c nodeName:}" failed. No retries permitted until 2026-04-23 09:31:22.893788563 +0000 UTC m=+36.157175694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls") pod "dns-default-w7jwq" (UID: "208a4652-fd62-4c8b-b1b9-542601fb566c") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:21.499693 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:21.499652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t5mzg" event={"ID":"83e2352e-7188-4529-a79a-11d59e36b30b","Type":"ContainerStarted","Data":"83e066823b90d72c6472bc0f26318d2f7b78e725e605bf58ae21bb02891021fa"}
Apr 23 09:31:21.502415 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:21.502386 2575 generic.go:358] "Generic (PLEG): container finished" podID="d3bd2bf5-e581-45d4-b978-acec1bd86ea3" containerID="2c0147f483c77084ec52a789b697236d6658f7c3ea533f586568cef810576a53" exitCode=0
Apr 23 09:31:21.502556 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:21.502433 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerDied","Data":"2c0147f483c77084ec52a789b697236d6658f7c3ea533f586568cef810576a53"}
Apr 23 09:31:22.508395 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:22.508361 2575 generic.go:358] "Generic (PLEG): container finished" podID="d3bd2bf5-e581-45d4-b978-acec1bd86ea3" containerID="c04f5555c5f9f5e007f45ac17b694954fb4dea3c4982db74acac6d031629775c" exitCode=0
Apr 23 09:31:22.508809 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:22.508411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerDied","Data":"c04f5555c5f9f5e007f45ac17b694954fb4dea3c4982db74acac6d031629775c"}
Apr 23 09:31:22.907971 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:22.907928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq"
Apr 23 09:31:22.908166 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:22.907985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2"
Apr 23 09:31:22.908166 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:22.908084 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:22.908166 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:22.908091 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:22.908166 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:22.908163 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert podName:198951b6-14d0-4d27-82e3-20e88b58ddc3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:26.908142734 +0000 UTC m=+40.171529864 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert") pod "ingress-canary-tlwj2" (UID: "198951b6-14d0-4d27-82e3-20e88b58ddc3") : secret "canary-serving-cert" not found
Apr 23 09:31:22.908383 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:22.908203 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls podName:208a4652-fd62-4c8b-b1b9-542601fb566c nodeName:}" failed. No retries permitted until 2026-04-23 09:31:26.908192406 +0000 UTC m=+40.171579541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls") pod "dns-default-w7jwq" (UID: "208a4652-fd62-4c8b-b1b9-542601fb566c") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:23.513462 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:23.513417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" event={"ID":"d3bd2bf5-e581-45d4-b978-acec1bd86ea3","Type":"ContainerStarted","Data":"f8fd11ce95ce3117256cdbf8297ad9ad1ca1302cf75e9ddd76aea6113c5c3ad7"}
Apr 23 09:31:23.542188 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:23.542109 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mvdjb" podStartSLOduration=4.799757679 podStartE2EDuration="36.542090158s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:30:48.615002935 +0000 UTC m=+1.878390062" lastFinishedPulling="2026-04-23 09:31:20.35733541 +0000 UTC m=+33.620722541" observedRunningTime="2026-04-23 09:31:23.540293032 +0000 UTC m=+36.803680186" watchObservedRunningTime="2026-04-23 09:31:23.542090158 +0000 UTC m=+36.805477309"
Apr 23 09:31:24.516558 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:24.516520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t5mzg" event={"ID":"83e2352e-7188-4529-a79a-11d59e36b30b","Type":"ContainerStarted","Data":"544dea4cb198bc576bfe95de72c8813a2ccdd9fdba9c9f8f4cae22b2c66d83f0"}
Apr 23 09:31:24.517002 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:24.516932 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t5mzg"
Apr 23 09:31:24.539269 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:24.539224 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t5mzg" podStartSLOduration=34.45202843 podStartE2EDuration="37.53920857s" podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:20.505193049 +0000 UTC m=+33.768580196" lastFinishedPulling="2026-04-23 09:31:23.592373194 +0000 UTC m=+36.855760336" observedRunningTime="2026-04-23 09:31:24.539066414 +0000 UTC m=+37.802453570" watchObservedRunningTime="2026-04-23 09:31:24.53920857 +0000 UTC m=+37.802595719"
Apr 23 09:31:26.938192 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.938132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2"
Apr 23 09:31:26.938712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.938248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq"
Apr 23 09:31:26.938712 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:26.938360 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:26.938712 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:26.938416 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:26.938712 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:26.938447 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert podName:198951b6-14d0-4d27-82e3-20e88b58ddc3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:34.938426501 +0000 UTC m=+48.201813629 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert") pod "ingress-canary-tlwj2" (UID: "198951b6-14d0-4d27-82e3-20e88b58ddc3") : secret "canary-serving-cert" not found
Apr 23 09:31:26.938712 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:26.938474 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls podName:208a4652-fd62-4c8b-b1b9-542601fb566c nodeName:}" failed. No retries permitted until 2026-04-23 09:31:34.938457786 +0000 UTC m=+48.201844931 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls") pod "dns-default-w7jwq" (UID: "208a4652-fd62-4c8b-b1b9-542601fb566c") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:26.987359 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.987319 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"]
Apr 23 09:31:26.995621 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.995593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"
Apr 23 09:31:26.998495 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.998465 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 09:31:26.998637 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.998535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6ggr7\""
Apr 23 09:31:26.999729 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:26.999696 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 09:31:27.000695 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.000677 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"]
Apr 23 09:31:27.139605 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.139567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hl5z\" (UniqueName: \"kubernetes.io/projected/d61a941a-d439-4bd9-ba1b-aa7f923ec15a-kube-api-access-5hl5z\") pod \"migrator-74bb7799d9-lqdgp\" (UID: \"d61a941a-d439-4bd9-ba1b-aa7f923ec15a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"
Apr 23 09:31:27.240546 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.240468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hl5z\" (UniqueName: \"kubernetes.io/projected/d61a941a-d439-4bd9-ba1b-aa7f923ec15a-kube-api-access-5hl5z\") pod \"migrator-74bb7799d9-lqdgp\" (UID: \"d61a941a-d439-4bd9-ba1b-aa7f923ec15a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"
Apr 23 09:31:27.250009 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.249968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hl5z\" (UniqueName: \"kubernetes.io/projected/d61a941a-d439-4bd9-ba1b-aa7f923ec15a-kube-api-access-5hl5z\") pod \"migrator-74bb7799d9-lqdgp\" (UID: \"d61a941a-d439-4bd9-ba1b-aa7f923ec15a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"
Apr 23 09:31:27.304901 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.304861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"
Apr 23 09:31:27.424145 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.424112 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp"]
Apr 23 09:31:27.427641 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:27.427617 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61a941a_d439_4bd9_ba1b_aa7f923ec15a.slice/crio-4e8f7fafbef948602e9bcabd718b60e45da7405a4d04bc2735131f81c1ba0dd8 WatchSource:0}: Error finding container 4e8f7fafbef948602e9bcabd718b60e45da7405a4d04bc2735131f81c1ba0dd8: Status 404 returned error can't find the container with id 4e8f7fafbef948602e9bcabd718b60e45da7405a4d04bc2735131f81c1ba0dd8
Apr 23 09:31:27.524434 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.524353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp" event={"ID":"d61a941a-d439-4bd9-ba1b-aa7f923ec15a","Type":"ContainerStarted","Data":"4e8f7fafbef948602e9bcabd718b60e45da7405a4d04bc2735131f81c1ba0dd8"}
Apr 23 09:31:27.913973 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:27.913942 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xvcqj_bbce7cd3-954b-41fd-b6c9-9f47fee30477/dns-node-resolver/0.log"
Apr 23 09:31:28.912834 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:28.912806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qbs24_7f30d2db-f0de-4938-9291-99d089bb41d8/node-ca/0.log"
Apr 23 09:31:29.529571 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:29.529491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp" event={"ID":"d61a941a-d439-4bd9-ba1b-aa7f923ec15a","Type":"ContainerStarted","Data":"b9094afd5b10fc2cfe79dda16611a26ecb6a0f63f6a5e0926c564f81a007ef59"}
Apr 23 09:31:29.529571 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:29.529535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp" event={"ID":"d61a941a-d439-4bd9-ba1b-aa7f923ec15a","Type":"ContainerStarted","Data":"bf8f3692fd490b6c2e7fc7cde86fba86d41b2e530f8a2a49397ed6aae040a709"}
Apr 23 09:31:29.549771 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:29.549718 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lqdgp" podStartSLOduration=1.7094160600000001 podStartE2EDuration="3.549704072s" podCreationTimestamp="2026-04-23 09:31:26 +0000 UTC" firstStartedPulling="2026-04-23 09:31:27.42959877 +0000 UTC m=+40.692985901" lastFinishedPulling="2026-04-23 09:31:29.26988677 +0000 UTC m=+42.533273913" observedRunningTime="2026-04-23 09:31:29.54861852 +0000 UTC m=+42.812005670" watchObservedRunningTime="2026-04-23 09:31:29.549704072 +0000 UTC m=+42.813091223"
Apr 23 09:31:34.995914 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:34.995879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq"
Apr 23 09:31:34.995914 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:34.995922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2"
Apr 23 09:31:34.996418 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:34.996028 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:34.996418 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:34.996036 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:34.996418 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:34.996084 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert podName:198951b6-14d0-4d27-82e3-20e88b58ddc3 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:50.99606782 +0000 UTC m=+64.259454948 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert") pod "ingress-canary-tlwj2" (UID: "198951b6-14d0-4d27-82e3-20e88b58ddc3") : secret "canary-serving-cert" not found
Apr 23 09:31:34.996418 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:31:34.996096 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls podName:208a4652-fd62-4c8b-b1b9-542601fb566c nodeName:}" failed. No retries permitted until 2026-04-23 09:31:50.99609004 +0000 UTC m=+64.259477168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls") pod "dns-default-w7jwq" (UID: "208a4652-fd62-4c8b-b1b9-542601fb566c") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:46.498877 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:46.498843 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7lqz"
Apr 23 09:31:47.841528 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.841493 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fjgbq"]
Apr 23 09:31:47.845867 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.845846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:47.849538 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.849514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qqv29\""
Apr 23 09:31:47.849735 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.849722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 09:31:47.850713 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.850685 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 09:31:47.850831 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.850751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 09:31:47.850831 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.850752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 09:31:47.865497 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.865468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fjgbq"]
Apr 23 09:31:47.941684 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.941643 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-zdw5m"]
Apr 23 09:31:47.945486 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.945440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zdw5m"
Apr 23 09:31:47.948411 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.948384 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mfh7v\""
Apr 23 09:31:47.948685 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.948669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 09:31:47.951298 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.951271 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 09:31:47.961899 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.961870 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zdw5m"]
Apr 23 09:31:47.988167 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.988132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/301171ee-bdb8-44e3-a27c-edc3a9a811c3-crio-socket\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:47.988167 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.988166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/301171ee-bdb8-44e3-a27c-edc3a9a811c3-data-volume\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:47.988391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.988205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/301171ee-bdb8-44e3-a27c-edc3a9a811c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:47.988391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.988223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnr2s\" (UniqueName: \"kubernetes.io/projected/301171ee-bdb8-44e3-a27c-edc3a9a811c3-kube-api-access-bnr2s\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:47.988391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:47.988333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/301171ee-bdb8-44e3-a27c-edc3a9a811c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.038789 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.038759 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64d8c47674-2l6rs"]
Apr 23 09:31:48.041762 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.041748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.044429 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.044411 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 09:31:48.044763 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.044748 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 09:31:48.044993 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.044981 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-shblk\""
Apr 23 09:31:48.045374 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.045361 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 09:31:48.051035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.051011 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 09:31:48.052369 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.052346 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64d8c47674-2l6rs"]
Apr 23 09:31:48.088659 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5gc9\" (UniqueName: \"kubernetes.io/projected/6dc929b3-11e5-41ca-90f0-b281ba8c1567-kube-api-access-t5gc9\") pod \"downloads-6bcc868b7-zdw5m\" (UID: \"6dc929b3-11e5-41ca-90f0-b281ba8c1567\") " pod="openshift-console/downloads-6bcc868b7-zdw5m"
Apr 23 09:31:48.088857 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/301171ee-bdb8-44e3-a27c-edc3a9a811c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.088857 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/301171ee-bdb8-44e3-a27c-edc3a9a811c3-crio-socket\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.088857 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/301171ee-bdb8-44e3-a27c-edc3a9a811c3-data-volume\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.088857 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/301171ee-bdb8-44e3-a27c-edc3a9a811c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.088857 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnr2s\" (UniqueName: \"kubernetes.io/projected/301171ee-bdb8-44e3-a27c-edc3a9a811c3-kube-api-access-bnr2s\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.089210 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.088915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/301171ee-bdb8-44e3-a27c-edc3a9a811c3-crio-socket\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.089284 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.089230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/301171ee-bdb8-44e3-a27c-edc3a9a811c3-data-volume\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.089416 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.089397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/301171ee-bdb8-44e3-a27c-edc3a9a811c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.091156 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.091139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/301171ee-bdb8-44e3-a27c-edc3a9a811c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.101651 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.101590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnr2s\" (UniqueName: \"kubernetes.io/projected/301171ee-bdb8-44e3-a27c-edc3a9a811c3-kube-api-access-bnr2s\") pod \"insights-runtime-extractor-fjgbq\" (UID: \"301171ee-bdb8-44e3-a27c-edc3a9a811c3\") " pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.154498 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.154466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fjgbq"
Apr 23 09:31:48.190247 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-image-registry-private-configuration\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190444 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-registry-tls\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190444 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-bound-sa-token\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190444 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-installation-pull-secrets\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190444 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-trusted-ca\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190444 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-ca-trust-extracted\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190674 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190455 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6f6\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-kube-api-access-hj6f6\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs"
Apr 23 09:31:48.190674 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-registry-certificates\") pod
\"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.190674 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.190529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5gc9\" (UniqueName: \"kubernetes.io/projected/6dc929b3-11e5-41ca-90f0-b281ba8c1567-kube-api-access-t5gc9\") pod \"downloads-6bcc868b7-zdw5m\" (UID: \"6dc929b3-11e5-41ca-90f0-b281ba8c1567\") " pod="openshift-console/downloads-6bcc868b7-zdw5m" Apr 23 09:31:48.202938 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.202908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5gc9\" (UniqueName: \"kubernetes.io/projected/6dc929b3-11e5-41ca-90f0-b281ba8c1567-kube-api-access-t5gc9\") pod \"downloads-6bcc868b7-zdw5m\" (UID: \"6dc929b3-11e5-41ca-90f0-b281ba8c1567\") " pod="openshift-console/downloads-6bcc868b7-zdw5m" Apr 23 09:31:48.255442 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.255412 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zdw5m" Apr 23 09:31:48.288253 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.288222 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fjgbq"] Apr 23 09:31:48.291426 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-ca-trust-extracted\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291562 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6f6\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-kube-api-access-hj6f6\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291562 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-registry-certificates\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291562 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-image-registry-private-configuration\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") 
" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291708 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-registry-tls\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291708 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-bound-sa-token\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291708 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-installation-pull-secrets\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291708 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-trusted-ca\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291902 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.291786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-ca-trust-extracted\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.291902 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:48.291824 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301171ee_bdb8_44e3_a27c_edc3a9a811c3.slice/crio-369e7668b436af17b24978731a8627d2318cfd2635604ad450129639a8453b38 WatchSource:0}: Error finding container 369e7668b436af17b24978731a8627d2318cfd2635604ad450129639a8453b38: Status 404 returned error can't find the container with id 369e7668b436af17b24978731a8627d2318cfd2635604ad450129639a8453b38 Apr 23 09:31:48.294102 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.292728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-trusted-ca\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.294102 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.293106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-registry-certificates\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.295312 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.295166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-installation-pull-secrets\") pod \"image-registry-64d8c47674-2l6rs\" (UID: 
\"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.295312 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.295197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-image-registry-private-configuration\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.295773 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.295755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-registry-tls\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.304070 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.303790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-bound-sa-token\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.305705 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.305525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6f6\" (UniqueName: \"kubernetes.io/projected/f9dd5ac4-f74f-439e-8fb8-9f280d70427b-kube-api-access-hj6f6\") pod \"image-registry-64d8c47674-2l6rs\" (UID: \"f9dd5ac4-f74f-439e-8fb8-9f280d70427b\") " pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.351340 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.351313 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.381821 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.381783 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zdw5m"] Apr 23 09:31:48.385566 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:48.385531 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dc929b3_11e5_41ca_90f0_b281ba8c1567.slice/crio-25588cd5cef6307efb608248d64c6f5308fddd715b74c45c626dda1755db3725 WatchSource:0}: Error finding container 25588cd5cef6307efb608248d64c6f5308fddd715b74c45c626dda1755db3725: Status 404 returned error can't find the container with id 25588cd5cef6307efb608248d64c6f5308fddd715b74c45c626dda1755db3725 Apr 23 09:31:48.472217 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.472171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64d8c47674-2l6rs"] Apr 23 09:31:48.475379 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:48.475353 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9dd5ac4_f74f_439e_8fb8_9f280d70427b.slice/crio-0622e429905e3ca41dc98633d91835ac5d729cd3882cda5b6caa39bf01365adc WatchSource:0}: Error finding container 0622e429905e3ca41dc98633d91835ac5d729cd3882cda5b6caa39bf01365adc: Status 404 returned error can't find the container with id 0622e429905e3ca41dc98633d91835ac5d729cd3882cda5b6caa39bf01365adc Apr 23 09:31:48.567001 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.566945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zdw5m" event={"ID":"6dc929b3-11e5-41ca-90f0-b281ba8c1567","Type":"ContainerStarted","Data":"25588cd5cef6307efb608248d64c6f5308fddd715b74c45c626dda1755db3725"} Apr 23 09:31:48.568504 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:31:48.568464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjgbq" event={"ID":"301171ee-bdb8-44e3-a27c-edc3a9a811c3","Type":"ContainerStarted","Data":"e40df7d475bdd808340b089dac91fef0fff2c5ab08c2981f1ff7911077f0f37d"} Apr 23 09:31:48.568504 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.568511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjgbq" event={"ID":"301171ee-bdb8-44e3-a27c-edc3a9a811c3","Type":"ContainerStarted","Data":"369e7668b436af17b24978731a8627d2318cfd2635604ad450129639a8453b38"} Apr 23 09:31:48.569678 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.569651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" event={"ID":"f9dd5ac4-f74f-439e-8fb8-9f280d70427b","Type":"ContainerStarted","Data":"9f684b7ddf19ced56353a6bad18ba778341745d6c8c1fcf380e577e3fcd5d10a"} Apr 23 09:31:48.569678 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.569681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" event={"ID":"f9dd5ac4-f74f-439e-8fb8-9f280d70427b","Type":"ContainerStarted","Data":"0622e429905e3ca41dc98633d91835ac5d729cd3882cda5b6caa39bf01365adc"} Apr 23 09:31:48.569874 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.569819 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:31:48.590139 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:48.590077 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" podStartSLOduration=0.590061338 podStartE2EDuration="590.061338ms" podCreationTimestamp="2026-04-23 09:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 09:31:48.589021583 +0000 UTC m=+61.852408734" watchObservedRunningTime="2026-04-23 09:31:48.590061338 +0000 UTC m=+61.853448487" Apr 23 09:31:49.574121 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:49.574077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjgbq" event={"ID":"301171ee-bdb8-44e3-a27c-edc3a9a811c3","Type":"ContainerStarted","Data":"412c051a2c0111f42f9e65d628d7ba093886d3a4da02ba3c639ed70006ba7a76"} Apr 23 09:31:51.015080 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.015034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:51.015446 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.015226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:51.017404 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.017376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/208a4652-fd62-4c8b-b1b9-542601fb566c-metrics-tls\") pod \"dns-default-w7jwq\" (UID: \"208a4652-fd62-4c8b-b1b9-542601fb566c\") " pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:51.017517 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.017498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/198951b6-14d0-4d27-82e3-20e88b58ddc3-cert\") pod \"ingress-canary-tlwj2\" (UID: \"198951b6-14d0-4d27-82e3-20e88b58ddc3\") " 
pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:51.243523 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.243489 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvkwp\"" Apr 23 09:31:51.251212 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.251170 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:51.257829 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.257803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mw7t9\"" Apr 23 09:31:51.265009 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.264978 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tlwj2" Apr 23 09:31:51.536398 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.536364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tlwj2"] Apr 23 09:31:51.540668 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:51.540633 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod198951b6_14d0_4d27_82e3_20e88b58ddc3.slice/crio-4c19f25c5fe05863518e3392fccb8835cbe61775d880be78724bcba685bcfdf0 WatchSource:0}: Error finding container 4c19f25c5fe05863518e3392fccb8835cbe61775d880be78724bcba685bcfdf0: Status 404 returned error can't find the container with id 4c19f25c5fe05863518e3392fccb8835cbe61775d880be78724bcba685bcfdf0 Apr 23 09:31:51.554806 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.554777 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w7jwq"] Apr 23 09:31:51.558403 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:51.558366 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208a4652_fd62_4c8b_b1b9_542601fb566c.slice/crio-a92797d8e7dcb4f6d9c077bb9c0540d9bc2b52fd71d81b64c447ff7324f86d66 WatchSource:0}: Error finding container a92797d8e7dcb4f6d9c077bb9c0540d9bc2b52fd71d81b64c447ff7324f86d66: Status 404 returned error can't find the container with id a92797d8e7dcb4f6d9c077bb9c0540d9bc2b52fd71d81b64c447ff7324f86d66 Apr 23 09:31:51.581826 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.581787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w7jwq" event={"ID":"208a4652-fd62-4c8b-b1b9-542601fb566c","Type":"ContainerStarted","Data":"a92797d8e7dcb4f6d9c077bb9c0540d9bc2b52fd71d81b64c447ff7324f86d66"} Apr 23 09:31:51.584328 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.584294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjgbq" event={"ID":"301171ee-bdb8-44e3-a27c-edc3a9a811c3","Type":"ContainerStarted","Data":"ecebd79db9f211e8f7758aff41b99c0043e43b0f132c3395e76e8cb70387b08d"} Apr 23 09:31:51.585560 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.585532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tlwj2" event={"ID":"198951b6-14d0-4d27-82e3-20e88b58ddc3","Type":"ContainerStarted","Data":"4c19f25c5fe05863518e3392fccb8835cbe61775d880be78724bcba685bcfdf0"} Apr 23 09:31:51.606708 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:51.606645 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fjgbq" podStartSLOduration=1.569365299 podStartE2EDuration="4.606625546s" podCreationTimestamp="2026-04-23 09:31:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:48.369337569 +0000 UTC m=+61.632724711" lastFinishedPulling="2026-04-23 09:31:51.40659781 +0000 UTC m=+64.669984958" observedRunningTime="2026-04-23 09:31:51.606025738 +0000 UTC m=+64.869412889" 
watchObservedRunningTime="2026-04-23 09:31:51.606625546 +0000 UTC m=+64.870012689" Apr 23 09:31:52.022036 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:52.021993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:52.024614 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:52.024584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9675e92f-c255-4d0a-a137-2fb828720d4d-metrics-certs\") pod \"network-metrics-daemon-nfwtj\" (UID: \"9675e92f-c255-4d0a-a137-2fb828720d4d\") " pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:52.073617 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:52.073580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gg4dj\"" Apr 23 09:31:52.081959 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:52.081934 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nfwtj" Apr 23 09:31:52.213121 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:52.213087 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nfwtj"] Apr 23 09:31:52.216599 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:31:52.216558 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9675e92f_c255_4d0a_a137_2fb828720d4d.slice/crio-9d159aaa0eda440f57de98fac644bc652e8bf6d52c75958148b4f119c5370986 WatchSource:0}: Error finding container 9d159aaa0eda440f57de98fac644bc652e8bf6d52c75958148b4f119c5370986: Status 404 returned error can't find the container with id 9d159aaa0eda440f57de98fac644bc652e8bf6d52c75958148b4f119c5370986 Apr 23 09:31:52.591316 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:52.591263 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nfwtj" event={"ID":"9675e92f-c255-4d0a-a137-2fb828720d4d","Type":"ContainerStarted","Data":"9d159aaa0eda440f57de98fac644bc652e8bf6d52c75958148b4f119c5370986"} Apr 23 09:31:55.601048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.601006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w7jwq" event={"ID":"208a4652-fd62-4c8b-b1b9-542601fb566c","Type":"ContainerStarted","Data":"582fe3969522dc95abab022de43b3844a214a38f5774243467748093793608fd"} Apr 23 09:31:55.601048 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.601051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w7jwq" event={"ID":"208a4652-fd62-4c8b-b1b9-542601fb566c","Type":"ContainerStarted","Data":"65ec7408214387c45ef5226ade19c8e9263b5242833f763c482a689beb5a4639"} Apr 23 09:31:55.601609 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.601113 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-w7jwq" Apr 23 09:31:55.602709 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.602680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nfwtj" event={"ID":"9675e92f-c255-4d0a-a137-2fb828720d4d","Type":"ContainerStarted","Data":"31fb545487c3bb8aabbe7b49b11e154fe7abc3d84cd0c003b6b950e80f1c467d"} Apr 23 09:31:55.602830 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.602717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nfwtj" event={"ID":"9675e92f-c255-4d0a-a137-2fb828720d4d","Type":"ContainerStarted","Data":"dbefe6a6d9785b169a4e1eb1462b696cfbb51dc9722d4dafc29cddcabd7cbbe8"} Apr 23 09:31:55.604141 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.604114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tlwj2" event={"ID":"198951b6-14d0-4d27-82e3-20e88b58ddc3","Type":"ContainerStarted","Data":"02dcff3f7af30fedf3051e09521f14156d65b4068ae68eff77959f406f4674c1"} Apr 23 09:31:55.619648 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.619593 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w7jwq" podStartSLOduration=33.271468194 podStartE2EDuration="36.619575838s" podCreationTimestamp="2026-04-23 09:31:19 +0000 UTC" firstStartedPulling="2026-04-23 09:31:51.560910569 +0000 UTC m=+64.824297699" lastFinishedPulling="2026-04-23 09:31:54.909018209 +0000 UTC m=+68.172405343" observedRunningTime="2026-04-23 09:31:55.618247691 +0000 UTC m=+68.881634845" watchObservedRunningTime="2026-04-23 09:31:55.619575838 +0000 UTC m=+68.882962988" Apr 23 09:31:55.648879 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.648817 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nfwtj" podStartSLOduration=65.95865705 podStartE2EDuration="1m8.648797598s" 
podCreationTimestamp="2026-04-23 09:30:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:52.218878963 +0000 UTC m=+65.482266094" lastFinishedPulling="2026-04-23 09:31:54.909019497 +0000 UTC m=+68.172406642" observedRunningTime="2026-04-23 09:31:55.647759825 +0000 UTC m=+68.911146976" watchObservedRunningTime="2026-04-23 09:31:55.648797598 +0000 UTC m=+68.912184746" Apr 23 09:31:55.649043 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:55.648932 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tlwj2" podStartSLOduration=33.276336021 podStartE2EDuration="36.64892541s" podCreationTimestamp="2026-04-23 09:31:19 +0000 UTC" firstStartedPulling="2026-04-23 09:31:51.542894323 +0000 UTC m=+64.806281453" lastFinishedPulling="2026-04-23 09:31:54.9154837 +0000 UTC m=+68.178870842" observedRunningTime="2026-04-23 09:31:55.633087115 +0000 UTC m=+68.896474267" watchObservedRunningTime="2026-04-23 09:31:55.64892541 +0000 UTC m=+68.912312561" Apr 23 09:31:56.524117 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:31:56.524084 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t5mzg" Apr 23 09:32:05.609696 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:05.609589 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w7jwq" Apr 23 09:32:06.644692 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:06.644650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zdw5m" event={"ID":"6dc929b3-11e5-41ca-90f0-b281ba8c1567","Type":"ContainerStarted","Data":"70517b94edca6eb5fbb635ba9d3210a0f434a3875098ac692a48bb26742ec814"} Apr 23 09:32:06.645215 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:06.645088 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-zdw5m" Apr 23 09:32:06.646645 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:06.646603 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-zdw5m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.9:8080/\": dial tcp 10.134.0.9:8080: connect: connection refused" start-of-body= Apr 23 09:32:06.646754 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:06.646674 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-zdw5m" podUID="6dc929b3-11e5-41ca-90f0-b281ba8c1567" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.9:8080/\": dial tcp 10.134.0.9:8080: connect: connection refused" Apr 23 09:32:06.663703 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:06.663646 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-zdw5m" podStartSLOduration=1.580297895 podStartE2EDuration="19.66363088s" podCreationTimestamp="2026-04-23 09:31:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:48.388802848 +0000 UTC m=+61.652189981" lastFinishedPulling="2026-04-23 09:32:06.472135802 +0000 UTC m=+79.735522966" observedRunningTime="2026-04-23 09:32:06.661415929 +0000 UTC m=+79.924803085" watchObservedRunningTime="2026-04-23 09:32:06.66363088 +0000 UTC m=+79.927018068" Apr 23 09:32:07.664342 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:07.664307 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-zdw5m" Apr 23 09:32:09.579439 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:09.579407 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64d8c47674-2l6rs" Apr 23 09:32:24.535157 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.535122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"] Apr 23 09:32:24.577677 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.577648 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vndrs"] Apr 23 09:32:24.577826 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.577810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.580607 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.580581 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 09:32:24.580770 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.580614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 09:32:24.580770 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.580580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 09:32:24.580770 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.580580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 09:32:24.581601 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.581585 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 09:32:24.581691 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.581617 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8k85n\"" Apr 23 09:32:24.591977 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.591957 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"] Apr 23 09:32:24.591977 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.591980 2575 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qtccq"] Apr 23 09:32:24.592127 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.592104 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.594725 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.594705 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cdnqx\"" Apr 23 09:32:24.595186 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.595133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 09:32:24.595288 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.595267 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 09:32:24.595350 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.595304 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 09:32:24.619815 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.619791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qtccq"] Apr 23 09:32:24.619952 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.619912 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.623420 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.623397 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 09:32:24.623554 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.623446 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-h4mw9\"" Apr 23 09:32:24.623554 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.623397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 09:32:24.623704 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.623397 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 09:32:24.652733 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.652733 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-accelerators-collector-config\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " 
pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.652939 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-root\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.652939 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-textfile\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.652939 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-sys\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.652939 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.653060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.653060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw2g7\" (UniqueName: \"kubernetes.io/projected/44f52516-aac2-4cb9-8344-3113b9705130-kube-api-access-qw2g7\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.653060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.652998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh8tw\" (UniqueName: \"kubernetes.io/projected/6f8dff77-f950-4d94-b370-b3392941773d-kube-api-access-lh8tw\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.653060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f8dff77-f950-4d94-b370-b3392941773d-metrics-client-ca\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.653060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-tls\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " 
pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.653249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.653249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/566b4995-5609-43dd-8655-9d5459456f3e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.653249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44f52516-aac2-4cb9-8344-3113b9705130-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.653249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.653249 ip-10-0-129-154 kubenswrapper[2575]: 
I0423 09:32:24.653218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/44f52516-aac2-4cb9-8344-3113b9705130-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.653249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.653428 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-wtmp\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.653428 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.653278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdczn\" (UniqueName: \"kubernetes.io/projected/566b4995-5609-43dd-8655-9d5459456f3e-kube-api-access-qdczn\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.754398 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.754398 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-accelerators-collector-config\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-root\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-textfile\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-sys\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 
09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-root\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-sys\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.754618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw2g7\" (UniqueName: \"kubernetes.io/projected/44f52516-aac2-4cb9-8344-3113b9705130-kube-api-access-qw2g7\") pod 
\"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh8tw\" (UniqueName: \"kubernetes.io/projected/6f8dff77-f950-4d94-b370-b3392941773d-kube-api-access-lh8tw\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f8dff77-f950-4d94-b370-b3392941773d-metrics-client-ca\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-tls\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:24.754701 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: 
\"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/566b4995-5609-43dd-8655-9d5459456f3e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44f52516-aac2-4cb9-8344-3113b9705130-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:24.754799 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/44f52516-aac2-4cb9-8344-3113b9705130-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: 
\"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-wtmp\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:24.754883 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-tls podName:6f8dff77-f950-4d94-b370-b3392941773d nodeName:}" failed. No retries permitted until 2026-04-23 09:32:25.254861261 +0000 UTC m=+98.518248404 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-tls") pod "node-exporter-vndrs" (UID: "6f8dff77-f950-4d94-b370-b3392941773d") : secret "node-exporter-tls" not found Apr 23 09:32:24.754913 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdczn\" (UniqueName: \"kubernetes.io/projected/566b4995-5609-43dd-8655-9d5459456f3e-kube-api-access-qdczn\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.755542 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:24.754962 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-tls podName:44f52516-aac2-4cb9-8344-3113b9705130 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:25.254947309 +0000 UTC m=+98.518334451 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-qtccq" (UID: "44f52516-aac2-4cb9-8344-3113b9705130") : secret "kube-state-metrics-tls" not found Apr 23 09:32:24.755542 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.754985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-wtmp\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.755542 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:24.755065 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 23 09:32:24.755542 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:24.755119 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls podName:566b4995-5609-43dd-8655-9d5459456f3e nodeName:}" failed. No retries permitted until 2026-04-23 09:32:25.255103233 +0000 UTC m=+98.518490391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-fxpl2" (UID: "566b4995-5609-43dd-8655-9d5459456f3e") : secret "openshift-state-metrics-tls" not found Apr 23 09:32:24.755542 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.755251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.755542 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.755492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/44f52516-aac2-4cb9-8344-3113b9705130-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.755835 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.755705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/566b4995-5609-43dd-8655-9d5459456f3e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.755835 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.755801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44f52516-aac2-4cb9-8344-3113b9705130-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.757276 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.757245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" Apr 23 09:32:24.757678 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.757656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" Apr 23 09:32:24.764898 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.764862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-textfile\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 09:32:24.765129 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.765101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-accelerators-collector-config\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs" Apr 23 
09:32:24.765236 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.765130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f8dff77-f950-4d94-b370-b3392941773d-metrics-client-ca\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs"
Apr 23 09:32:24.766766 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.766739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs"
Apr 23 09:32:24.766923 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.766898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh8tw\" (UniqueName: \"kubernetes.io/projected/6f8dff77-f950-4d94-b370-b3392941773d-kube-api-access-lh8tw\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs"
Apr 23 09:32:24.772457 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.772406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw2g7\" (UniqueName: \"kubernetes.io/projected/44f52516-aac2-4cb9-8344-3113b9705130-kube-api-access-qw2g7\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq"
Apr 23 09:32:24.773724 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:24.773677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdczn\" (UniqueName: \"kubernetes.io/projected/566b4995-5609-43dd-8655-9d5459456f3e-kube-api-access-qdczn\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"
Apr 23 09:32:25.258365 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.258327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq"
Apr 23 09:32:25.258556 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.258375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-tls\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs"
Apr 23 09:32:25.258556 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.258413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"
Apr 23 09:32:25.258556 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:25.258495 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 23 09:32:25.258721 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:32:25.258572 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls podName:566b4995-5609-43dd-8655-9d5459456f3e nodeName:}" failed. No retries permitted until 2026-04-23 09:32:26.258553839 +0000 UTC m=+99.521940982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-fxpl2" (UID: "566b4995-5609-43dd-8655-9d5459456f3e") : secret "openshift-state-metrics-tls" not found
Apr 23 09:32:25.260851 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.260819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f8dff77-f950-4d94-b370-b3392941773d-node-exporter-tls\") pod \"node-exporter-vndrs\" (UID: \"6f8dff77-f950-4d94-b370-b3392941773d\") " pod="openshift-monitoring/node-exporter-vndrs"
Apr 23 09:32:25.260851 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.260843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44f52516-aac2-4cb9-8344-3113b9705130-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qtccq\" (UID: \"44f52516-aac2-4cb9-8344-3113b9705130\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq"
Apr 23 09:32:25.500269 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.500233 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vndrs"
Apr 23 09:32:25.507810 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:32:25.507780 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8dff77_f950_4d94_b370_b3392941773d.slice/crio-33610a793e790d2fce504dcad4ed0a5fbd8a33389f9c549914293f9efd190dde WatchSource:0}: Error finding container 33610a793e790d2fce504dcad4ed0a5fbd8a33389f9c549914293f9efd190dde: Status 404 returned error can't find the container with id 33610a793e790d2fce504dcad4ed0a5fbd8a33389f9c549914293f9efd190dde
Apr 23 09:32:25.528747 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.528721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq"
Apr 23 09:32:25.647442 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.647411 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qtccq"]
Apr 23 09:32:25.650231 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:32:25.650198 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f52516_aac2_4cb9_8344_3113b9705130.slice/crio-f1879d533bc8da9b96773064cda2d6b0b2bba87f062c5a9a5a2394cfc695929c WatchSource:0}: Error finding container f1879d533bc8da9b96773064cda2d6b0b2bba87f062c5a9a5a2394cfc695929c: Status 404 returned error can't find the container with id f1879d533bc8da9b96773064cda2d6b0b2bba87f062c5a9a5a2394cfc695929c
Apr 23 09:32:25.707495 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.707452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vndrs" event={"ID":"6f8dff77-f950-4d94-b370-b3392941773d","Type":"ContainerStarted","Data":"33610a793e790d2fce504dcad4ed0a5fbd8a33389f9c549914293f9efd190dde"}
Apr 23 09:32:25.708480 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:25.708456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" event={"ID":"44f52516-aac2-4cb9-8344-3113b9705130","Type":"ContainerStarted","Data":"f1879d533bc8da9b96773064cda2d6b0b2bba87f062c5a9a5a2394cfc695929c"}
Apr 23 09:32:26.267683 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.267650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"
Apr 23 09:32:26.270306 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.270278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/566b4995-5609-43dd-8655-9d5459456f3e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fxpl2\" (UID: \"566b4995-5609-43dd-8655-9d5459456f3e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"
Apr 23 09:32:26.386708 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.386672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"
Apr 23 09:32:26.522325 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.522237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2"]
Apr 23 09:32:26.560977 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.560944 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"]
Apr 23 09:32:26.564808 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.564784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.567811 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.567760 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 23 09:32:26.567811 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.567776 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 23 09:32:26.568029 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.567843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 23 09:32:26.568029 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.567786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2geqrum5fge0e\""
Apr 23 09:32:26.568029 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.567764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-kfv2w\""
Apr 23 09:32:26.568223 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.568048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 23 09:32:26.568223 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.568215 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 23 09:32:26.582142 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.582093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"]
Apr 23 09:32:26.670592 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-grpc-tls\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-tls\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcldz\" (UniqueName: \"kubernetes.io/projected/334e1be2-d4fe-48bc-8763-cb2bf329d81d-kube-api-access-fcldz\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/334e1be2-d4fe-48bc-8763-cb2bf329d81d-metrics-client-ca\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.671005 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.670963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.713688 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.713646 2575 generic.go:358] "Generic (PLEG): container finished" podID="6f8dff77-f950-4d94-b370-b3392941773d" containerID="e40cded1b5d21d24ef4a76c4da7471c866f5c76a02613591e306555385ba1f1d" exitCode=0
Apr 23 09:32:26.713851 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.713721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vndrs" event={"ID":"6f8dff77-f950-4d94-b370-b3392941773d","Type":"ContainerDied","Data":"e40cded1b5d21d24ef4a76c4da7471c866f5c76a02613591e306555385ba1f1d"}
Apr 23 09:32:26.751296 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:32:26.751257 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod566b4995_5609_43dd_8655_9d5459456f3e.slice/crio-76b5c026b46da5ba6f65ece4fc8c11b555b46ba0ff96d1eeea5b54f3a2b76f75 WatchSource:0}: Error finding container 76b5c026b46da5ba6f65ece4fc8c11b555b46ba0ff96d1eeea5b54f3a2b76f75: Status 404 returned error can't find the container with id 76b5c026b46da5ba6f65ece4fc8c11b555b46ba0ff96d1eeea5b54f3a2b76f75
Apr 23 09:32:26.771723 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.771696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcldz\" (UniqueName: \"kubernetes.io/projected/334e1be2-d4fe-48bc-8763-cb2bf329d81d-kube-api-access-fcldz\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.771846 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.771730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.771846 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.771764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/334e1be2-d4fe-48bc-8763-cb2bf329d81d-metrics-client-ca\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.771846 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.771805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.772305 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.772241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.772305 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.772287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.772480 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.772368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-grpc-tls\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.772480 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.772408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-tls\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.772595 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.772492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/334e1be2-d4fe-48bc-8763-cb2bf329d81d-metrics-client-ca\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.775013 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.774966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.775013 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.775000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.775303 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.775279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-tls\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.775409 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.775356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.775540 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.775515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-grpc-tls\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.775853 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.775833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/334e1be2-d4fe-48bc-8763-cb2bf329d81d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.780342 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.780320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcldz\" (UniqueName: \"kubernetes.io/projected/334e1be2-d4fe-48bc-8763-cb2bf329d81d-kube-api-access-fcldz\") pod \"thanos-querier-5f74d46c95-jtbt9\" (UID: \"334e1be2-d4fe-48bc-8763-cb2bf329d81d\") " pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:26.875675 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:26.875648 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"
Apr 23 09:32:27.051248 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.051193 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f74d46c95-jtbt9"]
Apr 23 09:32:27.056591 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:32:27.056559 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod334e1be2_d4fe_48bc_8763_cb2bf329d81d.slice/crio-ecf0b0c0bc193a8bc31de18bce8f3aca7b38b7b761eeeeba77e4e1bad41ff5e7 WatchSource:0}: Error finding container ecf0b0c0bc193a8bc31de18bce8f3aca7b38b7b761eeeeba77e4e1bad41ff5e7: Status 404 returned error can't find the container with id ecf0b0c0bc193a8bc31de18bce8f3aca7b38b7b761eeeeba77e4e1bad41ff5e7
Apr 23 09:32:27.720139 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.719927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" event={"ID":"44f52516-aac2-4cb9-8344-3113b9705130","Type":"ContainerStarted","Data":"11b48a3adbcbd1e52836d1d916d3daa33152934ae7e1f709f95fa618218da614"}
Apr 23 09:32:27.720139 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.719984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" event={"ID":"44f52516-aac2-4cb9-8344-3113b9705130","Type":"ContainerStarted","Data":"fa51ba38fe6d9d512c48a58efaeb21bc6332577db690c0b2fb2176be6a5568d8"}
Apr 23 09:32:27.720139 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.720000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" event={"ID":"44f52516-aac2-4cb9-8344-3113b9705130","Type":"ContainerStarted","Data":"31c337a9e6b86ab9039f813b787b3e2a9be2f448582a02a52d7e757ca7b02c13"}
Apr 23 09:32:27.721587 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.721558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"ecf0b0c0bc193a8bc31de18bce8f3aca7b38b7b761eeeeba77e4e1bad41ff5e7"}
Apr 23 09:32:27.723627 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.723600 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vndrs" event={"ID":"6f8dff77-f950-4d94-b370-b3392941773d","Type":"ContainerStarted","Data":"87f5263b5cfa124c859a7221d1417779c1106f395dc54a8e205e3fd12da59bc0"}
Apr 23 09:32:27.723736 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.723633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vndrs" event={"ID":"6f8dff77-f950-4d94-b370-b3392941773d","Type":"ContainerStarted","Data":"3d7fd10607712a97478181cbdaf353039dcbff553312b4f59925f6ec7b118e8f"}
Apr 23 09:32:27.726124 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.726098 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" event={"ID":"566b4995-5609-43dd-8655-9d5459456f3e","Type":"ContainerStarted","Data":"f2ad6bbd65c98e74f9ecc41a4a32953bbde2aca02cbde3d8c3374efb06b1af7c"}
Apr 23 09:32:27.726245 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.726130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" event={"ID":"566b4995-5609-43dd-8655-9d5459456f3e","Type":"ContainerStarted","Data":"068c376f7709cb08b75493545d141f536382879cd8b0ede9ddf9a8fee980a679"}
Apr 23 09:32:27.726245 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.726143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" event={"ID":"566b4995-5609-43dd-8655-9d5459456f3e","Type":"ContainerStarted","Data":"76b5c026b46da5ba6f65ece4fc8c11b555b46ba0ff96d1eeea5b54f3a2b76f75"}
Apr 23 09:32:27.742266 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.742218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-qtccq" podStartSLOduration=2.595552102 podStartE2EDuration="3.742204899s" podCreationTimestamp="2026-04-23 09:32:24 +0000 UTC" firstStartedPulling="2026-04-23 09:32:25.652164475 +0000 UTC m=+98.915551604" lastFinishedPulling="2026-04-23 09:32:26.798817244 +0000 UTC m=+100.062204401" observedRunningTime="2026-04-23 09:32:27.741449524 +0000 UTC m=+101.004836675" watchObservedRunningTime="2026-04-23 09:32:27.742204899 +0000 UTC m=+101.005592049"
Apr 23 09:32:27.764974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:27.761499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vndrs" podStartSLOduration=3.001009794 podStartE2EDuration="3.761482023s" podCreationTimestamp="2026-04-23 09:32:24 +0000 UTC" firstStartedPulling="2026-04-23 09:32:25.509436752 +0000 UTC m=+98.772823880" lastFinishedPulling="2026-04-23 09:32:26.269908977 +0000 UTC m=+99.533296109" observedRunningTime="2026-04-23 09:32:27.760545277 +0000 UTC m=+101.023932428" watchObservedRunningTime="2026-04-23 09:32:27.761482023 +0000 UTC m=+101.024869174"
Apr 23 09:32:28.731588 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:28.731545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" event={"ID":"566b4995-5609-43dd-8655-9d5459456f3e","Type":"ContainerStarted","Data":"baf26a0ca9d52f2bb9cd5defd91a11c22964c49e87ed6a2e5efad9617a125f18"}
Apr 23 09:32:28.756748 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:28.756686 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fxpl2" podStartSLOduration=3.640472652 podStartE2EDuration="4.756667328s" podCreationTimestamp="2026-04-23 09:32:24 +0000 UTC" firstStartedPulling="2026-04-23 09:32:26.91982255 +0000 UTC m=+100.183209681" lastFinishedPulling="2026-04-23 09:32:28.036017226 +0000 UTC m=+101.299404357" observedRunningTime="2026-04-23 09:32:28.754906658 +0000 UTC m=+102.018293810" watchObservedRunningTime="2026-04-23 09:32:28.756667328 +0000 UTC m=+102.020054480"
Apr 23 09:32:29.338309 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.338283 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"]
Apr 23 09:32:29.379185 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.379152 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"]
Apr 23 09:32:29.379372 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.379304 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"
Apr 23 09:32:29.383367 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.382592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 23 09:32:29.384636 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.383853 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-74bbb\""
Apr 23 09:32:29.495121 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.495093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e8e45c-0dda-4d4d-8f34-d004dce56258-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ztjrb\" (UID: \"d8e8e45c-0dda-4d4d-8f34-d004dce56258\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"
Apr 23 09:32:29.596440 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.596408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e8e45c-0dda-4d4d-8f34-d004dce56258-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ztjrb\" (UID: \"d8e8e45c-0dda-4d4d-8f34-d004dce56258\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"
Apr 23 09:32:29.598680 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.598659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e8e45c-0dda-4d4d-8f34-d004dce56258-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ztjrb\" (UID: \"d8e8e45c-0dda-4d4d-8f34-d004dce56258\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"
Apr 23 09:32:29.697164 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.697140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"
Apr 23 09:32:29.736212 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.736159 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"479a2ff7014bf9b495ae35dedd9a1045ee11222631f4da0ba41066f619a2ef58"}
Apr 23 09:32:29.736651 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.736222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"74f4f58f758e3b6d6cf95c75b125d6bb2e69af0365d4eb54a758092c468f6aa5"}
Apr 23 09:32:29.736651 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.736241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"22fee082e6343d42ef59c68d6d92c91d875f7d8b52c4779933a7d591e2d54818"}
Apr 23 09:32:29.828587 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:29.828554 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb"]
Apr 23 09:32:29.831351 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:32:29.831326 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e8e45c_0dda_4d4d_8f34_d004dce56258.slice/crio-a2d6ff00d2b262034d803878cc326e5a1edacac3be1e49c42ee4a390c18fd7a0 WatchSource:0}: Error finding container a2d6ff00d2b262034d803878cc326e5a1edacac3be1e49c42ee4a390c18fd7a0: Status 404 returned error can't find the container with id a2d6ff00d2b262034d803878cc326e5a1edacac3be1e49c42ee4a390c18fd7a0
Apr 23 09:32:30.740080 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.740035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb" event={"ID":"d8e8e45c-0dda-4d4d-8f34-d004dce56258","Type":"ContainerStarted","Data":"a2d6ff00d2b262034d803878cc326e5a1edacac3be1e49c42ee4a390c18fd7a0"}
Apr 23 09:32:30.765031 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.764998 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 09:32:30.785413 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.785381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 09:32:30.785581 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.785566 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 09:32:30.788390 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788360 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 09:32:30.788543 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788412 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 09:32:30.788543 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 09:32:30.788543 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788469 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 09:32:30.788543 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788412 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-g8rtl\""
Apr 23 09:32:30.788757 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788548 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 09:32:30.788757 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 09:32:30.788858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8i950k1h2busn\""
Apr 23 09:32:30.788858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.788801 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 09:32:30.789484 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.789464 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 09:32:30.789600 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.789507 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 09:32:30.789600 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.789523 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 09:32:30.789600 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.789551 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 09:32:30.793743 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.793726 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 09:32:30.795655 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.795637 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 09:32:30.907854 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.907821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 09:32:30.908007 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.907866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw6zk\" (UniqueName: \"kubernetes.io/projected/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-kube-api-access-dw6zk\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 09:32:30.908007 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.907940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 09:32:30.908007 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.907982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 09:32:30.908119 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908119 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908119 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908230 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908230 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908294 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-config\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908294 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-config-out\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908355 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908355 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908419 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908419 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908480 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908511 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:30.908511 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:30.908502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-web-config\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
09:32:31.009419 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009419 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009419 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-web-config\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009698 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009759 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw6zk\" (UniqueName: \"kubernetes.io/projected/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-kube-api-access-dw6zk\") pod \"prometheus-k8s-0\" (UID: 
\"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009815 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009815 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-k8s-rulefiles-0\") 
pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.009920 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010114 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.009953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010114 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-config\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010114 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-config-out\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010114 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010337 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010337 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010337 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.010555 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.011013 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010837 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.011013 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.010981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.011858 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.011347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.012826 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.012516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.012826 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.012532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.012826 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.012700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-web-config\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.012826 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.012710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.015578 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.014069 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.015578 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.015406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.015578 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.015461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.015772 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.015642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.016162 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.016134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.016449 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.016425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.016864 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.016839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-config\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.018093 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.018062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.018496 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.018474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-config-out\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.021862 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.021841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw6zk\" (UniqueName: \"kubernetes.io/projected/2c1cf94a-acf7-4db9-91bd-fd78724c8bbb-kube-api-access-dw6zk\") pod \"prometheus-k8s-0\" (UID: \"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.107025 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.106984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:32:31.440313 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.440289 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 09:32:31.554138 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:32:31.554094 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1cf94a_acf7_4db9_91bd_fd78724c8bbb.slice/crio-22f586eaffe4b943978046ffd50508a5087d69072aeb81a7533ab5cd958fb84d WatchSource:0}: Error finding container 22f586eaffe4b943978046ffd50508a5087d69072aeb81a7533ab5cd958fb84d: Status 404 returned error can't find the container with id 22f586eaffe4b943978046ffd50508a5087d69072aeb81a7533ab5cd958fb84d Apr 23 09:32:31.743763 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.743733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"22f586eaffe4b943978046ffd50508a5087d69072aeb81a7533ab5cd958fb84d"} Apr 23 09:32:31.745060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.745038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb" event={"ID":"d8e8e45c-0dda-4d4d-8f34-d004dce56258","Type":"ContainerStarted","Data":"dabf5261b7192d431fd40c350e6f5e533bee3909fa74620693870638d79d0d81"} Apr 23 09:32:31.745238 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.745217 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb" Apr 23 09:32:31.747476 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.747455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"a7c45e4ef56b65be8a752636b16fbe12d6b64838b146b8857cae752d8711525a"} Apr 23 09:32:31.747563 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.747482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"275dd96cc3c34e402dcc8c4c09b6dcab62a86c76f477281acdb4c070ad5e24eb"} Apr 23 09:32:31.750552 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.750532 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb" Apr 23 09:32:31.760871 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:31.760829 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ztjrb" podStartSLOduration=1.011296306 podStartE2EDuration="2.760817745s" podCreationTimestamp="2026-04-23 09:32:29 +0000 UTC" 
firstStartedPulling="2026-04-23 09:32:29.83320037 +0000 UTC m=+103.096587499" lastFinishedPulling="2026-04-23 09:32:31.582721796 +0000 UTC m=+104.846108938" observedRunningTime="2026-04-23 09:32:31.759850521 +0000 UTC m=+105.023237671" watchObservedRunningTime="2026-04-23 09:32:31.760817745 +0000 UTC m=+105.024204895" Apr 23 09:32:32.752694 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:32.752653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" event={"ID":"334e1be2-d4fe-48bc-8763-cb2bf329d81d","Type":"ContainerStarted","Data":"8a727cf382cec274f2588c6aed493c28499a6dbf6d9445ebca91b40701d4774e"} Apr 23 09:32:32.778666 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:32.778610 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" podStartSLOduration=2.528466282 podStartE2EDuration="6.778593867s" podCreationTimestamp="2026-04-23 09:32:26 +0000 UTC" firstStartedPulling="2026-04-23 09:32:27.059141469 +0000 UTC m=+100.322528597" lastFinishedPulling="2026-04-23 09:32:31.30926905 +0000 UTC m=+104.572656182" observedRunningTime="2026-04-23 09:32:32.777470785 +0000 UTC m=+106.040857938" watchObservedRunningTime="2026-04-23 09:32:32.778593867 +0000 UTC m=+106.041981017" Apr 23 09:32:33.756726 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:33.756689 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c1cf94a-acf7-4db9-91bd-fd78724c8bbb" containerID="2307cee465311be26d98696f7681d9c2e581030fbd220e530ae7eb007ea9c84d" exitCode=0 Apr 23 09:32:33.757147 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:33.756781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerDied","Data":"2307cee465311be26d98696f7681d9c2e581030fbd220e530ae7eb007ea9c84d"} Apr 23 09:32:33.757465 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:32:33.757439 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" Apr 23 09:32:34.766935 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:34.766874 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5f74d46c95-jtbt9" Apr 23 09:32:37.776337 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.776305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"55f271745b325e532cf283b627c818d566b2d51ab978a4e3446b23d3fa7cb056"} Apr 23 09:32:37.776337 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.776340 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"2c7fa25758d26f9b799191507d0668feb649aab1f3d424e1458daeb6ed54da78"} Apr 23 09:32:37.776337 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.776350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"14a3f7e963ae88a2b50f0a2700d575bff9cddceb01243a1304aa4622fa3cf395"} Apr 23 09:32:37.776756 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.776358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"231a7da10994acf3390610ca01fc82aa2781c8db1115346e1fbb9c2b1b916741"} Apr 23 09:32:37.776756 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.776366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"0c273cdb46ef893b0a2b7684ba53ba7a6855ce09856bc8a34017d7b4dd0d7d59"} Apr 23 09:32:37.776756 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.776376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2c1cf94a-acf7-4db9-91bd-fd78724c8bbb","Type":"ContainerStarted","Data":"a0701906755c581f5c2f95526f376e90776ef528b2b83c8c6906bd5e03140e66"} Apr 23 09:32:37.809056 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:37.808994 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.584665389 podStartE2EDuration="7.808976087s" podCreationTimestamp="2026-04-23 09:32:30 +0000 UTC" firstStartedPulling="2026-04-23 09:32:31.578689722 +0000 UTC m=+104.842076856" lastFinishedPulling="2026-04-23 09:32:36.803000424 +0000 UTC m=+110.066387554" observedRunningTime="2026-04-23 09:32:37.805979603 +0000 UTC m=+111.069366752" watchObservedRunningTime="2026-04-23 09:32:37.808976087 +0000 UTC m=+111.072363237" Apr 23 09:32:41.107971 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:32:41.107932 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:33:31.107798 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:33:31.107706 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:33:31.126524 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:33:31.126493 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:33:31.945602 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:33:31.945577 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 09:34:10.875478 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:34:10.875440 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-69bds"] Apr 23 09:34:10.878635 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:10.878616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:10.881120 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:10.881096 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 09:34:10.887510 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:10.887478 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-69bds"] Apr 23 09:34:10.946508 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:10.946474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1cc257b2-c01f-4024-8519-71ce67221cd9-dbus\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:10.946677 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:10.946524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1cc257b2-c01f-4024-8519-71ce67221cd9-kubelet-config\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:10.946677 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:10.946582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1cc257b2-c01f-4024-8519-71ce67221cd9-original-pull-secret\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " 
pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.047712 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.047677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1cc257b2-c01f-4024-8519-71ce67221cd9-dbus\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.047884 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.047730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1cc257b2-c01f-4024-8519-71ce67221cd9-kubelet-config\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.047884 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.047761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1cc257b2-c01f-4024-8519-71ce67221cd9-original-pull-secret\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.047884 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.047834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1cc257b2-c01f-4024-8519-71ce67221cd9-kubelet-config\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.048001 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.047885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1cc257b2-c01f-4024-8519-71ce67221cd9-dbus\") pod \"global-pull-secret-syncer-69bds\" (UID: 
\"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.050003 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.049981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1cc257b2-c01f-4024-8519-71ce67221cd9-original-pull-secret\") pod \"global-pull-secret-syncer-69bds\" (UID: \"1cc257b2-c01f-4024-8519-71ce67221cd9\") " pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.188448 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.188362 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-69bds" Apr 23 09:34:11.304891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:11.304733 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-69bds"] Apr 23 09:34:11.307623 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:34:11.307589 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc257b2_c01f_4024_8519_71ce67221cd9.slice/crio-9617d3cd215d089497c66c91c0134eebaa3ea1d9298541d18f6697d32e7dd90d WatchSource:0}: Error finding container 9617d3cd215d089497c66c91c0134eebaa3ea1d9298541d18f6697d32e7dd90d: Status 404 returned error can't find the container with id 9617d3cd215d089497c66c91c0134eebaa3ea1d9298541d18f6697d32e7dd90d Apr 23 09:34:12.046487 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:12.046448 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-69bds" event={"ID":"1cc257b2-c01f-4024-8519-71ce67221cd9","Type":"ContainerStarted","Data":"9617d3cd215d089497c66c91c0134eebaa3ea1d9298541d18f6697d32e7dd90d"} Apr 23 09:34:16.058742 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:16.058707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-69bds" 
event={"ID":"1cc257b2-c01f-4024-8519-71ce67221cd9","Type":"ContainerStarted","Data":"1e9ca799a81acdd902160b624503ea8fdbf928208d73f0003a762c47f2955912"} Apr 23 09:34:16.075076 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:34:16.075027 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-69bds" podStartSLOduration=2.11107892 podStartE2EDuration="6.075012704s" podCreationTimestamp="2026-04-23 09:34:10 +0000 UTC" firstStartedPulling="2026-04-23 09:34:11.309159549 +0000 UTC m=+204.572546676" lastFinishedPulling="2026-04-23 09:34:15.273093329 +0000 UTC m=+208.536480460" observedRunningTime="2026-04-23 09:34:16.072979476 +0000 UTC m=+209.336366625" watchObservedRunningTime="2026-04-23 09:34:16.075012704 +0000 UTC m=+209.338399854" Apr 23 09:35:32.220249 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.220214 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz"] Apr 23 09:35:32.223483 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.223459 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.226131 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.226108 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:35:32.227273 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.227252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 23 09:35:32.227339 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.227285 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-wbbn5\"" Apr 23 09:35:32.230024 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.230000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz"] Apr 23 09:35:32.324263 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.324236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cd7095f-96a9-47eb-a770-b67fdfe65502-tmp\") pod \"jobset-operator-747c5859c7-4vqrz\" (UID: \"9cd7095f-96a9-47eb-a770-b67fdfe65502\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.324431 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.324267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw2t6\" (UniqueName: \"kubernetes.io/projected/9cd7095f-96a9-47eb-a770-b67fdfe65502-kube-api-access-zw2t6\") pod \"jobset-operator-747c5859c7-4vqrz\" (UID: \"9cd7095f-96a9-47eb-a770-b67fdfe65502\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.424626 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.424589 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cd7095f-96a9-47eb-a770-b67fdfe65502-tmp\") pod \"jobset-operator-747c5859c7-4vqrz\" (UID: \"9cd7095f-96a9-47eb-a770-b67fdfe65502\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.424626 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.424624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw2t6\" (UniqueName: \"kubernetes.io/projected/9cd7095f-96a9-47eb-a770-b67fdfe65502-kube-api-access-zw2t6\") pod \"jobset-operator-747c5859c7-4vqrz\" (UID: \"9cd7095f-96a9-47eb-a770-b67fdfe65502\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.425023 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.425001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cd7095f-96a9-47eb-a770-b67fdfe65502-tmp\") pod \"jobset-operator-747c5859c7-4vqrz\" (UID: \"9cd7095f-96a9-47eb-a770-b67fdfe65502\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.432932 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.432906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw2t6\" (UniqueName: \"kubernetes.io/projected/9cd7095f-96a9-47eb-a770-b67fdfe65502-kube-api-access-zw2t6\") pod \"jobset-operator-747c5859c7-4vqrz\" (UID: \"9cd7095f-96a9-47eb-a770-b67fdfe65502\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.547044 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.547010 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" Apr 23 09:35:32.665307 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:32.665233 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz"] Apr 23 09:35:32.667763 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:35:32.667722 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd7095f_96a9_47eb_a770_b67fdfe65502.slice/crio-60df39963e9c5a6c53ef34e553ff1e6377eff10e48d253c4d7ef835227c73b00 WatchSource:0}: Error finding container 60df39963e9c5a6c53ef34e553ff1e6377eff10e48d253c4d7ef835227c73b00: Status 404 returned error can't find the container with id 60df39963e9c5a6c53ef34e553ff1e6377eff10e48d253c4d7ef835227c73b00 Apr 23 09:35:33.270737 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:33.270699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" event={"ID":"9cd7095f-96a9-47eb-a770-b67fdfe65502","Type":"ContainerStarted","Data":"60df39963e9c5a6c53ef34e553ff1e6377eff10e48d253c4d7ef835227c73b00"} Apr 23 09:35:36.282133 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:36.282041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" event={"ID":"9cd7095f-96a9-47eb-a770-b67fdfe65502","Type":"ContainerStarted","Data":"ece88aa9892e0f3696d3799f005185b5f7b99912f5ca36bd8629771b4b7029bb"} Apr 23 09:35:36.297679 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:36.297623 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4vqrz" podStartSLOduration=1.166174494 podStartE2EDuration="4.297607772s" podCreationTimestamp="2026-04-23 09:35:32 +0000 UTC" firstStartedPulling="2026-04-23 09:35:32.669191301 +0000 UTC m=+285.932578440" 
lastFinishedPulling="2026-04-23 09:35:35.800624588 +0000 UTC m=+289.064011718" observedRunningTime="2026-04-23 09:35:36.296285995 +0000 UTC m=+289.559673145" watchObservedRunningTime="2026-04-23 09:35:36.297607772 +0000 UTC m=+289.560994921" Apr 23 09:35:47.263642 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:35:47.263611 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 09:36:01.628376 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.628331 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75"] Apr 23 09:36:01.631566 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.631551 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.634292 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.634261 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 23 09:36:01.634415 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.634301 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 23 09:36:01.634489 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.634463 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 23 09:36:01.635452 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.635435 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 23 09:36:01.635543 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.635521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-7xg6n\"" Apr 23 09:36:01.638865 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.638566 2575 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75"] Apr 23 09:36:01.646408 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.646385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdwf7\" (UniqueName: \"kubernetes.io/projected/8987414a-01fe-4451-953e-901b5fbf04a5-kube-api-access-sdwf7\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.646504 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.646445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8987414a-01fe-4451-953e-901b5fbf04a5-cert\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.646504 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.646475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/8987414a-01fe-4451-953e-901b5fbf04a5-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.746909 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.746872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdwf7\" (UniqueName: \"kubernetes.io/projected/8987414a-01fe-4451-953e-901b5fbf04a5-kube-api-access-sdwf7\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " 
pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.746909 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.746911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8987414a-01fe-4451-953e-901b5fbf04a5-cert\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.747152 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.746941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/8987414a-01fe-4451-953e-901b5fbf04a5-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.747652 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.747633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/8987414a-01fe-4451-953e-901b5fbf04a5-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.749333 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.749309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8987414a-01fe-4451-953e-901b5fbf04a5-cert\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.755148 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.755125 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sdwf7\" (UniqueName: \"kubernetes.io/projected/8987414a-01fe-4451-953e-901b5fbf04a5-kube-api-access-sdwf7\") pod \"kubeflow-trainer-controller-manager-78c9cf59d7-p7k75\" (UID: \"8987414a-01fe-4451-953e-901b5fbf04a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:01.941667 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:01.941572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:02.063556 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:02.063522 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75"] Apr 23 09:36:02.067335 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:36:02.067296 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8987414a_01fe_4451_953e_901b5fbf04a5.slice/crio-02cf4832fe1e7c974bf06cdfb31695f521c6dc1658981922e15405447d86417b WatchSource:0}: Error finding container 02cf4832fe1e7c974bf06cdfb31695f521c6dc1658981922e15405447d86417b: Status 404 returned error can't find the container with id 02cf4832fe1e7c974bf06cdfb31695f521c6dc1658981922e15405447d86417b Apr 23 09:36:02.069196 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:02.069162 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:36:02.359546 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:02.359510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" event={"ID":"8987414a-01fe-4451-953e-901b5fbf04a5","Type":"ContainerStarted","Data":"02cf4832fe1e7c974bf06cdfb31695f521c6dc1658981922e15405447d86417b"} Apr 23 09:36:05.369681 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:05.369642 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" event={"ID":"8987414a-01fe-4451-953e-901b5fbf04a5","Type":"ContainerStarted","Data":"bab42ec96fd925b1039805edeb2416ca12f4cde9f2498fbcee45eca1a780654f"} Apr 23 09:36:05.370130 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:05.369692 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:36:05.385657 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:05.385608 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" podStartSLOduration=2.054762847 podStartE2EDuration="4.385593665s" podCreationTimestamp="2026-04-23 09:36:01 +0000 UTC" firstStartedPulling="2026-04-23 09:36:02.06933838 +0000 UTC m=+315.332725508" lastFinishedPulling="2026-04-23 09:36:04.400169193 +0000 UTC m=+317.663556326" observedRunningTime="2026-04-23 09:36:05.384082048 +0000 UTC m=+318.647469197" watchObservedRunningTime="2026-04-23 09:36:05.385593665 +0000 UTC m=+318.648980816" Apr 23 09:36:21.378398 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:36:21.378361 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-78c9cf59d7-p7k75" Apr 23 09:38:16.021300 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.021256 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996"] Apr 23 09:38:16.024698 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.024674 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:38:16.027235 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.027202 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"openshift-service-ca.crt\"" Apr 23 09:38:16.027235 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.027202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"default-dockercfg-zg6nr\"" Apr 23 09:38:16.027409 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.027203 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"kube-root-ca.crt\"" Apr 23 09:38:16.032755 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.032733 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996"] Apr 23 09:38:16.077069 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.077034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h899s\" (UniqueName: \"kubernetes.io/projected/ff994a76-624b-4d05-ad73-c5273a4da1db-kube-api-access-h899s\") pod \"progression-enabled-node-0-0-dm996\" (UID: \"ff994a76-624b-4d05-ad73-c5273a4da1db\") " pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:38:16.177760 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.177717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h899s\" (UniqueName: \"kubernetes.io/projected/ff994a76-624b-4d05-ad73-c5273a4da1db-kube-api-access-h899s\") pod \"progression-enabled-node-0-0-dm996\" (UID: \"ff994a76-624b-4d05-ad73-c5273a4da1db\") " pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:38:16.185890 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.185866 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h899s\" (UniqueName: \"kubernetes.io/projected/ff994a76-624b-4d05-ad73-c5273a4da1db-kube-api-access-h899s\") pod \"progression-enabled-node-0-0-dm996\" (UID: \"ff994a76-624b-4d05-ad73-c5273a4da1db\") " pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:38:16.335103 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.335072 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:38:16.458442 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.458409 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996"] Apr 23 09:38:16.461695 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:38:16.461668 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff994a76_624b_4d05_ad73_c5273a4da1db.slice/crio-197d7d80409e7bcffad23d77cdea5f16fb65389e4541a1040b01228e7550fd46 WatchSource:0}: Error finding container 197d7d80409e7bcffad23d77cdea5f16fb65389e4541a1040b01228e7550fd46: Status 404 returned error can't find the container with id 197d7d80409e7bcffad23d77cdea5f16fb65389e4541a1040b01228e7550fd46 Apr 23 09:38:16.745197 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:38:16.745128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" event={"ID":"ff994a76-624b-4d05-ad73-c5273a4da1db","Type":"ContainerStarted","Data":"197d7d80409e7bcffad23d77cdea5f16fb65389e4541a1040b01228e7550fd46"} Apr 23 09:40:07.095541 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:07.095505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" 
event={"ID":"ff994a76-624b-4d05-ad73-c5273a4da1db","Type":"ContainerStarted","Data":"beafcaaea58ca533b5542dc6297277d48bc8d30e176c944598fea90bad169033"} Apr 23 09:40:07.095963 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:07.095616 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:40:07.120751 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:07.120684 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" podStartSLOduration=2.316655453 podStartE2EDuration="1m52.120669877s" podCreationTimestamp="2026-04-23 09:38:15 +0000 UTC" firstStartedPulling="2026-04-23 09:38:16.463797705 +0000 UTC m=+449.727184833" lastFinishedPulling="2026-04-23 09:40:06.267812129 +0000 UTC m=+559.531199257" observedRunningTime="2026-04-23 09:40:07.119210634 +0000 UTC m=+560.382597785" watchObservedRunningTime="2026-04-23 09:40:07.120669877 +0000 UTC m=+560.384057026" Apr 23 09:40:08.098104 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:08.098069 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:40:30.096510 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:30.096465 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" podUID="ff994a76-624b-4d05-ad73-c5273a4da1db" containerName="node" probeResult="failure" output="Get \"http://10.134.0.19:28080/metrics\": dial tcp 10.134.0.19:28080: connect: connection refused" Apr 23 09:40:30.162558 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:30.162524 2575 generic.go:358] "Generic (PLEG): container finished" podID="ff994a76-624b-4d05-ad73-c5273a4da1db" containerID="beafcaaea58ca533b5542dc6297277d48bc8d30e176c944598fea90bad169033" exitCode=0 Apr 23 09:40:30.162707 ip-10-0-129-154 
kubenswrapper[2575]: I0423 09:40:30.162597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" event={"ID":"ff994a76-624b-4d05-ad73-c5273a4da1db","Type":"ContainerDied","Data":"beafcaaea58ca533b5542dc6297277d48bc8d30e176c944598fea90bad169033"} Apr 23 09:40:31.295148 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:31.295125 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:40:31.324786 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:31.324762 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h899s\" (UniqueName: \"kubernetes.io/projected/ff994a76-624b-4d05-ad73-c5273a4da1db-kube-api-access-h899s\") pod \"ff994a76-624b-4d05-ad73-c5273a4da1db\" (UID: \"ff994a76-624b-4d05-ad73-c5273a4da1db\") " Apr 23 09:40:31.326933 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:31.326894 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff994a76-624b-4d05-ad73-c5273a4da1db-kube-api-access-h899s" (OuterVolumeSpecName: "kube-api-access-h899s") pod "ff994a76-624b-4d05-ad73-c5273a4da1db" (UID: "ff994a76-624b-4d05-ad73-c5273a4da1db"). InnerVolumeSpecName "kube-api-access-h899s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:40:31.425881 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:31.425795 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h899s\" (UniqueName: \"kubernetes.io/projected/ff994a76-624b-4d05-ad73-c5273a4da1db-kube-api-access-h899s\") on node \"ip-10-0-129-154.ec2.internal\" DevicePath \"\"" Apr 23 09:40:32.169164 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:32.169128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" event={"ID":"ff994a76-624b-4d05-ad73-c5273a4da1db","Type":"ContainerDied","Data":"197d7d80409e7bcffad23d77cdea5f16fb65389e4541a1040b01228e7550fd46"} Apr 23 09:40:32.169164 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:32.169156 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996" Apr 23 09:40:32.169424 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:32.169160 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="197d7d80409e7bcffad23d77cdea5f16fb65389e4541a1040b01228e7550fd46" Apr 23 09:40:34.505922 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.505884 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6"] Apr 23 09:40:34.506314 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.506262 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff994a76-624b-4d05-ad73-c5273a4da1db" containerName="node" Apr 23 09:40:34.506314 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.506276 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff994a76-624b-4d05-ad73-c5273a4da1db" containerName="node" Apr 23 09:40:34.506381 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.506329 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="ff994a76-624b-4d05-ad73-c5273a4da1db" containerName="node" Apr 23 09:40:34.525946 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.525915 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6"] Apr 23 09:40:34.526107 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.526026 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:34.528825 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.528798 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"default-dockercfg-zg6nr\"" Apr 23 09:40:34.529070 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.528808 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"kube-root-ca.crt\"" Apr 23 09:40:34.529165 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.528808 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"openshift-service-ca.crt\"" Apr 23 09:40:34.553386 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.553335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76whd\" (UniqueName: \"kubernetes.io/projected/6ac434eb-364f-4c4e-8470-991e3747d7a4-kube-api-access-76whd\") pod \"progression-disabled-node-0-0-nz7m6\" (UID: \"6ac434eb-364f-4c4e-8470-991e3747d7a4\") " pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:34.654648 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.654611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76whd\" (UniqueName: \"kubernetes.io/projected/6ac434eb-364f-4c4e-8470-991e3747d7a4-kube-api-access-76whd\") pod \"progression-disabled-node-0-0-nz7m6\" (UID: 
\"6ac434eb-364f-4c4e-8470-991e3747d7a4\") " pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:34.663282 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.663253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76whd\" (UniqueName: \"kubernetes.io/projected/6ac434eb-364f-4c4e-8470-991e3747d7a4-kube-api-access-76whd\") pod \"progression-disabled-node-0-0-nz7m6\" (UID: \"6ac434eb-364f-4c4e-8470-991e3747d7a4\") " pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:34.842253 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.842216 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:34.964238 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:34.964205 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6"] Apr 23 09:40:34.967748 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:40:34.967714 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac434eb_364f_4c4e_8470_991e3747d7a4.slice/crio-1bf6d6a47c73ea0ac90a2c192de976e6bd0e9a8ea92a50d9c82d6b681c3d0e75 WatchSource:0}: Error finding container 1bf6d6a47c73ea0ac90a2c192de976e6bd0e9a8ea92a50d9c82d6b681c3d0e75: Status 404 returned error can't find the container with id 1bf6d6a47c73ea0ac90a2c192de976e6bd0e9a8ea92a50d9c82d6b681c3d0e75 Apr 23 09:40:35.180188 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:35.180132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" event={"ID":"6ac434eb-364f-4c4e-8470-991e3747d7a4","Type":"ContainerStarted","Data":"15fcf348ec5a4013f161e4336e5a0fcb4399404e3a2b725edb58a20381e008d9"} Apr 23 09:40:35.180188 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:35.180169 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" event={"ID":"6ac434eb-364f-4c4e-8470-991e3747d7a4","Type":"ContainerStarted","Data":"1bf6d6a47c73ea0ac90a2c192de976e6bd0e9a8ea92a50d9c82d6b681c3d0e75"} Apr 23 09:40:35.180430 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:35.180301 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:35.200017 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:35.199950 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" podStartSLOduration=1.199935302 podStartE2EDuration="1.199935302s" podCreationTimestamp="2026-04-23 09:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:40:35.197537815 +0000 UTC m=+588.460924992" watchObservedRunningTime="2026-04-23 09:40:35.199935302 +0000 UTC m=+588.463322534" Apr 23 09:40:37.186027 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:37.185996 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:58.223301 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:58.223255 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" podUID="6ac434eb-364f-4c4e-8470-991e3747d7a4" containerName="node" probeResult="failure" output="Get \"http://10.134.0.20:28080/metrics\": read tcp 10.134.0.2:45952->10.134.0.20:28080: read: connection reset by peer" Apr 23 09:40:58.252937 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:58.252905 2575 generic.go:358] "Generic (PLEG): container finished" podID="6ac434eb-364f-4c4e-8470-991e3747d7a4" 
containerID="15fcf348ec5a4013f161e4336e5a0fcb4399404e3a2b725edb58a20381e008d9" exitCode=0 Apr 23 09:40:58.253060 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:58.252963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" event={"ID":"6ac434eb-364f-4c4e-8470-991e3747d7a4","Type":"ContainerDied","Data":"15fcf348ec5a4013f161e4336e5a0fcb4399404e3a2b725edb58a20381e008d9"} Apr 23 09:40:59.390585 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:59.390560 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:40:59.565623 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:59.565594 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76whd\" (UniqueName: \"kubernetes.io/projected/6ac434eb-364f-4c4e-8470-991e3747d7a4-kube-api-access-76whd\") pod \"6ac434eb-364f-4c4e-8470-991e3747d7a4\" (UID: \"6ac434eb-364f-4c4e-8470-991e3747d7a4\") " Apr 23 09:40:59.567629 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:59.567600 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac434eb-364f-4c4e-8470-991e3747d7a4-kube-api-access-76whd" (OuterVolumeSpecName: "kube-api-access-76whd") pod "6ac434eb-364f-4c4e-8470-991e3747d7a4" (UID: "6ac434eb-364f-4c4e-8470-991e3747d7a4"). InnerVolumeSpecName "kube-api-access-76whd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:40:59.666418 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:40:59.666380 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76whd\" (UniqueName: \"kubernetes.io/projected/6ac434eb-364f-4c4e-8470-991e3747d7a4-kube-api-access-76whd\") on node \"ip-10-0-129-154.ec2.internal\" DevicePath \"\"" Apr 23 09:41:00.259488 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:00.259460 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" Apr 23 09:41:00.259652 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:00.259462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6" event={"ID":"6ac434eb-364f-4c4e-8470-991e3747d7a4","Type":"ContainerDied","Data":"1bf6d6a47c73ea0ac90a2c192de976e6bd0e9a8ea92a50d9c82d6b681c3d0e75"} Apr 23 09:41:00.259652 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:00.259566 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bf6d6a47c73ea0ac90a2c192de976e6bd0e9a8ea92a50d9c82d6b681c3d0e75" Apr 23 09:41:09.499849 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.499814 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm"] Apr 23 09:41:09.502391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.500161 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ac434eb-364f-4c4e-8470-991e3747d7a4" containerName="node" Apr 23 09:41:09.502391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.500186 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac434eb-364f-4c4e-8470-991e3747d7a4" containerName="node" Apr 23 09:41:09.502391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.500241 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6ac434eb-364f-4c4e-8470-991e3747d7a4" containerName="node" Apr 23 09:41:09.503326 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.503310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:09.505759 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.505731 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"kube-root-ca.crt\"" Apr 23 09:41:09.505891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.505785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"openshift-service-ca.crt\"" Apr 23 09:41:09.505891 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.505864 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"default-dockercfg-zg6nr\"" Apr 23 09:41:09.512480 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.512453 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm"] Apr 23 09:41:09.552974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.552940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrcpj\" (UniqueName: \"kubernetes.io/projected/94dee74a-600a-4d85-a8d2-6927ce9183eb-kube-api-access-lrcpj\") pod \"progression-invalid-node-0-0-jrpqm\" (UID: \"94dee74a-600a-4d85-a8d2-6927ce9183eb\") " pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:09.653603 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.653565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrcpj\" (UniqueName: \"kubernetes.io/projected/94dee74a-600a-4d85-a8d2-6927ce9183eb-kube-api-access-lrcpj\") pod \"progression-invalid-node-0-0-jrpqm\" (UID: 
\"94dee74a-600a-4d85-a8d2-6927ce9183eb\") " pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:09.661729 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.661697 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrcpj\" (UniqueName: \"kubernetes.io/projected/94dee74a-600a-4d85-a8d2-6927ce9183eb-kube-api-access-lrcpj\") pod \"progression-invalid-node-0-0-jrpqm\" (UID: \"94dee74a-600a-4d85-a8d2-6927ce9183eb\") " pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:09.813633 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.813591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:09.934585 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.934444 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm"] Apr 23 09:41:09.937246 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:41:09.937211 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94dee74a_600a_4d85_a8d2_6927ce9183eb.slice/crio-21fa02762f6545bed809c8a76b9eb8f0c1bfcbf3880684a10718c8f174e911a7 WatchSource:0}: Error finding container 21fa02762f6545bed809c8a76b9eb8f0c1bfcbf3880684a10718c8f174e911a7: Status 404 returned error can't find the container with id 21fa02762f6545bed809c8a76b9eb8f0c1bfcbf3880684a10718c8f174e911a7 Apr 23 09:41:09.939191 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:09.939164 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:41:10.295257 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:10.295215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" 
event={"ID":"94dee74a-600a-4d85-a8d2-6927ce9183eb","Type":"ContainerStarted","Data":"83ad212b2ca61c4b3980e3688d3dd709727fb143135b678d44e64bae660e13f0"} Apr 23 09:41:10.295257 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:10.295260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" event={"ID":"94dee74a-600a-4d85-a8d2-6927ce9183eb","Type":"ContainerStarted","Data":"21fa02762f6545bed809c8a76b9eb8f0c1bfcbf3880684a10718c8f174e911a7"} Apr 23 09:41:10.295466 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:10.295298 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:10.312943 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:10.312877 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" podStartSLOduration=1.3128593020000001 podStartE2EDuration="1.312859302s" podCreationTimestamp="2026-04-23 09:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:41:10.311075709 +0000 UTC m=+623.574462859" watchObservedRunningTime="2026-04-23 09:41:10.312859302 +0000 UTC m=+623.576246454" Apr 23 09:41:11.297105 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:11.297068 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.21:28080/metrics\": dial tcp 10.134.0.21:28080: connect: connection refused" Apr 23 09:41:11.298652 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:11.298619 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" 
podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.21:28080/metrics\": dial tcp 10.134.0.21:28080: connect: connection refused" Apr 23 09:41:12.301123 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:12.301089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:33.299839 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:33.299787 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.21:28080/metrics\": dial tcp 10.134.0.21:28080: connect: connection refused" Apr 23 09:41:33.373236 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:33.373151 2575 generic.go:358] "Generic (PLEG): container finished" podID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerID="83ad212b2ca61c4b3980e3688d3dd709727fb143135b678d44e64bae660e13f0" exitCode=0 Apr 23 09:41:33.373381 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:33.373227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" event={"ID":"94dee74a-600a-4d85-a8d2-6927ce9183eb","Type":"ContainerDied","Data":"83ad212b2ca61c4b3980e3688d3dd709727fb143135b678d44e64bae660e13f0"} Apr 23 09:41:34.498787 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:34.498765 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:41:34.563233 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:34.563200 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrcpj\" (UniqueName: \"kubernetes.io/projected/94dee74a-600a-4d85-a8d2-6927ce9183eb-kube-api-access-lrcpj\") pod \"94dee74a-600a-4d85-a8d2-6927ce9183eb\" (UID: \"94dee74a-600a-4d85-a8d2-6927ce9183eb\") " Apr 23 09:41:34.565223 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:34.565171 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94dee74a-600a-4d85-a8d2-6927ce9183eb-kube-api-access-lrcpj" (OuterVolumeSpecName: "kube-api-access-lrcpj") pod "94dee74a-600a-4d85-a8d2-6927ce9183eb" (UID: "94dee74a-600a-4d85-a8d2-6927ce9183eb"). InnerVolumeSpecName "kube-api-access-lrcpj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:41:34.664560 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:34.664464 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrcpj\" (UniqueName: \"kubernetes.io/projected/94dee74a-600a-4d85-a8d2-6927ce9183eb-kube-api-access-lrcpj\") on node \"ip-10-0-129-154.ec2.internal\" DevicePath \"\"" Apr 23 09:41:35.380520 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:35.380491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" event={"ID":"94dee74a-600a-4d85-a8d2-6927ce9183eb","Type":"ContainerDied","Data":"21fa02762f6545bed809c8a76b9eb8f0c1bfcbf3880684a10718c8f174e911a7"} Apr 23 09:41:35.380520 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:35.380529 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21fa02762f6545bed809c8a76b9eb8f0c1bfcbf3880684a10718c8f174e911a7" Apr 23 09:41:35.380742 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:41:35.380501 2575 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm" Apr 23 09:43:25.345299 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.345260 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs"] Apr 23 09:43:25.345769 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.345577 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerName="node" Apr 23 09:43:25.345769 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.345589 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerName="node" Apr 23 09:43:25.345769 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.345666 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" containerName="node" Apr 23 09:43:25.348627 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.348610 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:25.351662 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.351638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"default-dockercfg-zg6nr\"" Apr 23 09:43:25.351965 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.351952 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"openshift-service-ca.crt\"" Apr 23 09:43:25.356864 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.356843 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"kube-root-ca.crt\"" Apr 23 09:43:25.380236 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.380207 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs"] Apr 23 09:43:25.484213 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.484160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfm2\" (UniqueName: \"kubernetes.io/projected/532c027e-a8e5-4e77-b8ee-e8e15babc76e-kube-api-access-mrfm2\") pod \"progression-no-metrics-node-0-0-7djqs\" (UID: \"532c027e-a8e5-4e77-b8ee-e8e15babc76e\") " pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:25.585054 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.585012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfm2\" (UniqueName: \"kubernetes.io/projected/532c027e-a8e5-4e77-b8ee-e8e15babc76e-kube-api-access-mrfm2\") pod \"progression-no-metrics-node-0-0-7djqs\" (UID: \"532c027e-a8e5-4e77-b8ee-e8e15babc76e\") " pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:25.607286 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.607228 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfm2\" (UniqueName: \"kubernetes.io/projected/532c027e-a8e5-4e77-b8ee-e8e15babc76e-kube-api-access-mrfm2\") pod \"progression-no-metrics-node-0-0-7djqs\" (UID: \"532c027e-a8e5-4e77-b8ee-e8e15babc76e\") " pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:25.658911 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.658878 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:25.787959 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:25.787929 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs"] Apr 23 09:43:25.789758 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:43:25.789731 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod532c027e_a8e5_4e77_b8ee_e8e15babc76e.slice/crio-af3ed18abc0298747b8724d9a1e583971a82fea38ebdbf2994bd0ac4260b7e28 WatchSource:0}: Error finding container af3ed18abc0298747b8724d9a1e583971a82fea38ebdbf2994bd0ac4260b7e28: Status 404 returned error can't find the container with id af3ed18abc0298747b8724d9a1e583971a82fea38ebdbf2994bd0ac4260b7e28 Apr 23 09:43:26.762644 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:26.762604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" event={"ID":"532c027e-a8e5-4e77-b8ee-e8e15babc76e","Type":"ContainerStarted","Data":"f464633cb74be997986908cb8a18fd126788a8edf913488da9c2d9838a3314a6"} Apr 23 09:43:26.763013 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:26.762650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" 
event={"ID":"532c027e-a8e5-4e77-b8ee-e8e15babc76e","Type":"ContainerStarted","Data":"af3ed18abc0298747b8724d9a1e583971a82fea38ebdbf2994bd0ac4260b7e28"} Apr 23 09:43:26.781326 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:26.781271 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" podStartSLOduration=1.781232787 podStartE2EDuration="1.781232787s" podCreationTimestamp="2026-04-23 09:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:43:26.779034989 +0000 UTC m=+760.042422140" watchObservedRunningTime="2026-04-23 09:43:26.781232787 +0000 UTC m=+760.044619938" Apr 23 09:43:31.779347 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:31.779309 2575 generic.go:358] "Generic (PLEG): container finished" podID="532c027e-a8e5-4e77-b8ee-e8e15babc76e" containerID="f464633cb74be997986908cb8a18fd126788a8edf913488da9c2d9838a3314a6" exitCode=0 Apr 23 09:43:31.779710 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:31.779381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" event={"ID":"532c027e-a8e5-4e77-b8ee-e8e15babc76e","Type":"ContainerDied","Data":"f464633cb74be997986908cb8a18fd126788a8edf913488da9c2d9838a3314a6"} Apr 23 09:43:32.901811 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:32.901789 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:33.046993 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:33.046964 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrfm2\" (UniqueName: \"kubernetes.io/projected/532c027e-a8e5-4e77-b8ee-e8e15babc76e-kube-api-access-mrfm2\") pod \"532c027e-a8e5-4e77-b8ee-e8e15babc76e\" (UID: \"532c027e-a8e5-4e77-b8ee-e8e15babc76e\") " Apr 23 09:43:33.049027 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:33.048990 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532c027e-a8e5-4e77-b8ee-e8e15babc76e-kube-api-access-mrfm2" (OuterVolumeSpecName: "kube-api-access-mrfm2") pod "532c027e-a8e5-4e77-b8ee-e8e15babc76e" (UID: "532c027e-a8e5-4e77-b8ee-e8e15babc76e"). InnerVolumeSpecName "kube-api-access-mrfm2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:43:33.148199 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:33.148146 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrfm2\" (UniqueName: \"kubernetes.io/projected/532c027e-a8e5-4e77-b8ee-e8e15babc76e-kube-api-access-mrfm2\") on node \"ip-10-0-129-154.ec2.internal\" DevicePath \"\"" Apr 23 09:43:33.787623 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:33.787537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" event={"ID":"532c027e-a8e5-4e77-b8ee-e8e15babc76e","Type":"ContainerDied","Data":"af3ed18abc0298747b8724d9a1e583971a82fea38ebdbf2994bd0ac4260b7e28"} Apr 23 09:43:33.787623 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:33.787574 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af3ed18abc0298747b8724d9a1e583971a82fea38ebdbf2994bd0ac4260b7e28" Apr 23 09:43:33.787623 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:33.787585 2575 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs" Apr 23 09:43:38.240479 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.240443 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s7ptq/must-gather-j5wdf"] Apr 23 09:43:38.240860 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.240769 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="532c027e-a8e5-4e77-b8ee-e8e15babc76e" containerName="node" Apr 23 09:43:38.240860 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.240780 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="532c027e-a8e5-4e77-b8ee-e8e15babc76e" containerName="node" Apr 23 09:43:38.240860 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.240842 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="532c027e-a8e5-4e77-b8ee-e8e15babc76e" containerName="node" Apr 23 09:43:38.244028 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.244006 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.247262 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.247242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s7ptq\"/\"kube-root-ca.crt\"" Apr 23 09:43:38.248500 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.248482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s7ptq\"/\"default-dockercfg-mvbt4\"" Apr 23 09:43:38.248584 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.248483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s7ptq\"/\"openshift-service-ca.crt\"" Apr 23 09:43:38.257091 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.257069 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s7ptq/must-gather-j5wdf"] Apr 23 09:43:38.393945 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.393901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wtm\" (UniqueName: \"kubernetes.io/projected/05b17baa-b19e-4281-83a6-ececa9ab3692-kube-api-access-77wtm\") pod \"must-gather-j5wdf\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.393945 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.393951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05b17baa-b19e-4281-83a6-ececa9ab3692-must-gather-output\") pod \"must-gather-j5wdf\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.495237 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.495122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77wtm\" (UniqueName: 
\"kubernetes.io/projected/05b17baa-b19e-4281-83a6-ececa9ab3692-kube-api-access-77wtm\") pod \"must-gather-j5wdf\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.495237 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.495165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05b17baa-b19e-4281-83a6-ececa9ab3692-must-gather-output\") pod \"must-gather-j5wdf\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.495489 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.495472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05b17baa-b19e-4281-83a6-ececa9ab3692-must-gather-output\") pod \"must-gather-j5wdf\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.504898 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.504877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wtm\" (UniqueName: \"kubernetes.io/projected/05b17baa-b19e-4281-83a6-ececa9ab3692-kube-api-access-77wtm\") pod \"must-gather-j5wdf\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.553003 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.552964 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:43:38.680518 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.678622 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s7ptq/must-gather-j5wdf"] Apr 23 09:43:38.805206 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:38.805147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" event={"ID":"05b17baa-b19e-4281-83a6-ececa9ab3692","Type":"ContainerStarted","Data":"53fcb8305e09f5481df49b6a8e05b674e80885edf3a020f34da0b5edb56f7e17"} Apr 23 09:43:42.233444 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.233397 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6"] Apr 23 09:43:42.247980 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.247945 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-disabled-node-0-0-nz7m6"] Apr 23 09:43:42.253139 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.253111 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996"] Apr 23 09:43:42.257776 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.257750 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-enabled-node-0-0-dm996"] Apr 23 09:43:42.267235 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.267209 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm"] Apr 23 09:43:42.273916 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.273884 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-invalid-node-0-0-jrpqm"] Apr 23 09:43:42.292000 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.291966 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs"] Apr 23 09:43:42.300968 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:42.300938 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-no-metrics-node-0-0-7djqs"] Apr 23 09:43:43.357992 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.357956 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532c027e-a8e5-4e77-b8ee-e8e15babc76e" path="/var/lib/kubelet/pods/532c027e-a8e5-4e77-b8ee-e8e15babc76e/volumes" Apr 23 09:43:43.358505 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.358482 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac434eb-364f-4c4e-8470-991e3747d7a4" path="/var/lib/kubelet/pods/6ac434eb-364f-4c4e-8470-991e3747d7a4/volumes" Apr 23 09:43:43.358893 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.358870 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94dee74a-600a-4d85-a8d2-6927ce9183eb" path="/var/lib/kubelet/pods/94dee74a-600a-4d85-a8d2-6927ce9183eb/volumes" Apr 23 09:43:43.359291 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.359273 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff994a76-624b-4d05-ad73-c5273a4da1db" path="/var/lib/kubelet/pods/ff994a76-624b-4d05-ad73-c5273a4da1db/volumes" Apr 23 09:43:43.826217 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.826109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" event={"ID":"05b17baa-b19e-4281-83a6-ececa9ab3692","Type":"ContainerStarted","Data":"0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45"} Apr 23 09:43:43.826217 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.826146 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" 
event={"ID":"05b17baa-b19e-4281-83a6-ececa9ab3692","Type":"ContainerStarted","Data":"caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742"} Apr 23 09:43:43.843835 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:43.843781 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" podStartSLOduration=1.2093874709999999 podStartE2EDuration="5.843766037s" podCreationTimestamp="2026-04-23 09:43:38 +0000 UTC" firstStartedPulling="2026-04-23 09:43:38.684998156 +0000 UTC m=+771.948385286" lastFinishedPulling="2026-04-23 09:43:43.319376706 +0000 UTC m=+776.582763852" observedRunningTime="2026-04-23 09:43:43.842360728 +0000 UTC m=+777.105747879" watchObservedRunningTime="2026-04-23 09:43:43.843766037 +0000 UTC m=+777.107153225" Apr 23 09:43:53.505545 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:53.505500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-78c9cf59d7-p7k75_8987414a-01fe-4451-953e-901b5fbf04a5/manager/0.log" Apr 23 09:43:53.979124 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:53.979087 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-78c9cf59d7-p7k75_8987414a-01fe-4451-953e-901b5fbf04a5/manager/0.log" Apr 23 09:43:54.455592 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:43:54.455539 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-78c9cf59d7-p7k75_8987414a-01fe-4451-953e-901b5fbf04a5/manager/0.log" Apr 23 09:44:30.002366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:30.002260 2575 generic.go:358] "Generic (PLEG): container finished" podID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerID="caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742" exitCode=0 Apr 23 09:44:30.002366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:30.002341 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" event={"ID":"05b17baa-b19e-4281-83a6-ececa9ab3692","Type":"ContainerDied","Data":"caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742"} Apr 23 09:44:30.002824 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:30.002677 2575 scope.go:117] "RemoveContainer" containerID="caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742" Apr 23 09:44:30.745244 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:30.745213 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s7ptq_must-gather-j5wdf_05b17baa-b19e-4281-83a6-ececa9ab3692/gather/0.log" Apr 23 09:44:34.270999 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:34.270961 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-69bds_1cc257b2-c01f-4024-8519-71ce67221cd9/global-pull-secret-syncer/0.log" Apr 23 09:44:34.415089 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:34.415061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p669t_4922b0b7-cbb6-414b-9c1f-b71799a538cf/konnectivity-agent/0.log" Apr 23 09:44:34.439035 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:34.439009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-154.ec2.internal_c6d06d93aa42ede6e1ca1a00d0b49d7a/haproxy/0.log" Apr 23 09:44:36.105014 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.104978 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s7ptq/must-gather-j5wdf"] Apr 23 09:44:36.105451 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.105226 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="copy" containerID="cri-o://0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45" gracePeriod=2 Apr 23 
09:44:36.112739 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.112709 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s7ptq/must-gather-j5wdf"] Apr 23 09:44:36.340702 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.340680 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s7ptq_must-gather-j5wdf_05b17baa-b19e-4281-83a6-ececa9ab3692/copy/0.log" Apr 23 09:44:36.341045 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.341026 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:44:36.343156 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.343132 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" err="pods \"must-gather-j5wdf\" is forbidden: User \"system:node:ip-10-0-129-154.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s7ptq\": no relationship found between node 'ip-10-0-129-154.ec2.internal' and this object" Apr 23 09:44:36.407834 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.407743 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05b17baa-b19e-4281-83a6-ececa9ab3692-must-gather-output\") pod \"05b17baa-b19e-4281-83a6-ececa9ab3692\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " Apr 23 09:44:36.407834 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.407813 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77wtm\" (UniqueName: \"kubernetes.io/projected/05b17baa-b19e-4281-83a6-ececa9ab3692-kube-api-access-77wtm\") pod \"05b17baa-b19e-4281-83a6-ececa9ab3692\" (UID: \"05b17baa-b19e-4281-83a6-ececa9ab3692\") " Apr 23 09:44:36.410003 ip-10-0-129-154 kubenswrapper[2575]: I0423 
09:44:36.409969 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b17baa-b19e-4281-83a6-ececa9ab3692-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "05b17baa-b19e-4281-83a6-ececa9ab3692" (UID: "05b17baa-b19e-4281-83a6-ececa9ab3692"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:44:36.410126 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.410020 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b17baa-b19e-4281-83a6-ececa9ab3692-kube-api-access-77wtm" (OuterVolumeSpecName: "kube-api-access-77wtm") pod "05b17baa-b19e-4281-83a6-ececa9ab3692" (UID: "05b17baa-b19e-4281-83a6-ececa9ab3692"). InnerVolumeSpecName "kube-api-access-77wtm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:44:36.508320 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.508279 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-77wtm\" (UniqueName: \"kubernetes.io/projected/05b17baa-b19e-4281-83a6-ececa9ab3692-kube-api-access-77wtm\") on node \"ip-10-0-129-154.ec2.internal\" DevicePath \"\"" Apr 23 09:44:36.508320 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:36.508309 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05b17baa-b19e-4281-83a6-ececa9ab3692-must-gather-output\") on node \"ip-10-0-129-154.ec2.internal\" DevicePath \"\"" Apr 23 09:44:37.025000 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.024970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s7ptq_must-gather-j5wdf_05b17baa-b19e-4281-83a6-ececa9ab3692/copy/0.log" Apr 23 09:44:37.025391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.025356 2575 generic.go:358] "Generic (PLEG): container finished" podID="05b17baa-b19e-4281-83a6-ececa9ab3692" 
containerID="0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45" exitCode=143 Apr 23 09:44:37.025530 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.025418 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" Apr 23 09:44:37.025530 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.025458 2575 scope.go:117] "RemoveContainer" containerID="0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45" Apr 23 09:44:37.027875 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.027840 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" err="pods \"must-gather-j5wdf\" is forbidden: User \"system:node:ip-10-0-129-154.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s7ptq\": no relationship found between node 'ip-10-0-129-154.ec2.internal' and this object" Apr 23 09:44:37.033701 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.033683 2575 scope.go:117] "RemoveContainer" containerID="caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742" Apr 23 09:44:37.035565 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.035544 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" err="pods \"must-gather-j5wdf\" is forbidden: User \"system:node:ip-10-0-129-154.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s7ptq\": no relationship found between node 'ip-10-0-129-154.ec2.internal' and this object" Apr 23 09:44:37.045132 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.045115 2575 scope.go:117] "RemoveContainer" containerID="0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45" Apr 23 09:44:37.045401 ip-10-0-129-154 kubenswrapper[2575]: E0423 
09:44:37.045379 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45\": container with ID starting with 0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45 not found: ID does not exist" containerID="0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45" Apr 23 09:44:37.045482 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.045417 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45"} err="failed to get container status \"0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45\": rpc error: code = NotFound desc = could not find container \"0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45\": container with ID starting with 0e2ffa26c9caddd1afeb222cac5f15c490d25253991980f0b5646582bcc72e45 not found: ID does not exist" Apr 23 09:44:37.045482 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.045466 2575 scope.go:117] "RemoveContainer" containerID="caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742" Apr 23 09:44:37.045708 ip-10-0-129-154 kubenswrapper[2575]: E0423 09:44:37.045688 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742\": container with ID starting with caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742 not found: ID does not exist" containerID="caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742" Apr 23 09:44:37.045749 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.045715 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742"} err="failed to get container status 
\"caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742\": rpc error: code = NotFound desc = could not find container \"caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742\": container with ID starting with caed18b0d01cedd8c1c79abf0ffba0936e050b468661074c5ff5a660644c1742 not found: ID does not exist" Apr 23 09:44:37.357103 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.357070 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" pod="openshift-must-gather-s7ptq/must-gather-j5wdf" err="pods \"must-gather-j5wdf\" is forbidden: User \"system:node:ip-10-0-129-154.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s7ptq\": no relationship found between node 'ip-10-0-129-154.ec2.internal' and this object" Apr 23 09:44:37.357723 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.357703 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" path="/var/lib/kubelet/pods/05b17baa-b19e-4281-83a6-ececa9ab3692/volumes" Apr 23 09:44:37.544382 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.544355 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qtccq_44f52516-aac2-4cb9-8344-3113b9705130/kube-state-metrics/0.log" Apr 23 09:44:37.565472 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.565440 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qtccq_44f52516-aac2-4cb9-8344-3113b9705130/kube-rbac-proxy-main/0.log" Apr 23 09:44:37.592599 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.592563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qtccq_44f52516-aac2-4cb9-8344-3113b9705130/kube-rbac-proxy-self/0.log" Apr 23 09:44:37.645989 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.645908 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-ztjrb_d8e8e45c-0dda-4d4d-8f34-d004dce56258/monitoring-plugin/0.log" Apr 23 09:44:37.827616 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.827585 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vndrs_6f8dff77-f950-4d94-b370-b3392941773d/node-exporter/0.log" Apr 23 09:44:37.854982 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.854954 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vndrs_6f8dff77-f950-4d94-b370-b3392941773d/kube-rbac-proxy/0.log" Apr 23 09:44:37.876617 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.876590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vndrs_6f8dff77-f950-4d94-b370-b3392941773d/init-textfile/0.log" Apr 23 09:44:37.903827 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.903754 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fxpl2_566b4995-5609-43dd-8655-9d5459456f3e/kube-rbac-proxy-main/0.log" Apr 23 09:44:37.926497 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.926470 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fxpl2_566b4995-5609-43dd-8655-9d5459456f3e/kube-rbac-proxy-self/0.log" Apr 23 09:44:37.946568 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.946537 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fxpl2_566b4995-5609-43dd-8655-9d5459456f3e/openshift-state-metrics/0.log" Apr 23 09:44:37.984845 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:37.984817 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/prometheus/0.log" Apr 23 09:44:38.003683 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.003656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/config-reloader/0.log" Apr 23 09:44:38.024383 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.024355 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/thanos-sidecar/0.log" Apr 23 09:44:38.045409 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.045381 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/kube-rbac-proxy-web/0.log" Apr 23 09:44:38.070388 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.070361 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/kube-rbac-proxy/0.log" Apr 23 09:44:38.098213 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.098160 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/kube-rbac-proxy-thanos/0.log" Apr 23 09:44:38.121553 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.121503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2c1cf94a-acf7-4db9-91bd-fd78724c8bbb/init-config-reloader/0.log" Apr 23 09:44:38.287983 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.287900 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f74d46c95-jtbt9_334e1be2-d4fe-48bc-8763-cb2bf329d81d/thanos-query/0.log" Apr 23 09:44:38.308535 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.308510 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f74d46c95-jtbt9_334e1be2-d4fe-48bc-8763-cb2bf329d81d/kube-rbac-proxy-web/0.log" Apr 23 09:44:38.331276 
ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.331251 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f74d46c95-jtbt9_334e1be2-d4fe-48bc-8763-cb2bf329d81d/kube-rbac-proxy/0.log" Apr 23 09:44:38.350227 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.350202 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f74d46c95-jtbt9_334e1be2-d4fe-48bc-8763-cb2bf329d81d/prom-label-proxy/0.log" Apr 23 09:44:38.378422 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.378397 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f74d46c95-jtbt9_334e1be2-d4fe-48bc-8763-cb2bf329d81d/kube-rbac-proxy-rules/0.log" Apr 23 09:44:38.403355 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:38.403307 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f74d46c95-jtbt9_334e1be2-d4fe-48bc-8763-cb2bf329d81d/kube-rbac-proxy-metrics/0.log" Apr 23 09:44:40.236450 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:40.236394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-zdw5m_6dc929b3-11e5-41ca-90f0-b281ba8c1567/download-server/0.log" Apr 23 09:44:41.100085 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100039 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"] Apr 23 09:44:41.100391 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100377 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="gather" Apr 23 09:44:41.100443 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100393 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="gather" Apr 23 09:44:41.100443 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100409 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="copy" Apr 23 09:44:41.100443 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100415 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="copy" Apr 23 09:44:41.100550 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100461 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="gather" Apr 23 09:44:41.100550 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.100473 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b17baa-b19e-4281-83a6-ececa9ab3692" containerName="copy" Apr 23 09:44:41.105062 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.105039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.107636 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.107615 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mpzrx\"/\"kube-root-ca.crt\"" Apr 23 09:44:41.108812 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.108796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mpzrx\"/\"default-dockercfg-xkkx2\"" Apr 23 09:44:41.108914 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.108830 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mpzrx\"/\"openshift-service-ca.crt\"" Apr 23 09:44:41.113407 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.113381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"] Apr 23 09:44:41.142320 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.142286 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-proc\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.142519 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.142332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl62c\" (UniqueName: \"kubernetes.io/projected/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-kube-api-access-bl62c\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.142519 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.142401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-sys\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.142519 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.142467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-lib-modules\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.142519 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.142492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-podres\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: 
\"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243563 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-proc\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl62c\" (UniqueName: \"kubernetes.io/projected/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-kube-api-access-bl62c\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-sys\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-lib-modules\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-proc\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-podres\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243753 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-podres\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-sys\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.243974 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.243791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-lib-modules\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" Apr 23 09:44:41.252099 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.252067 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bl62c\" (UniqueName: \"kubernetes.io/projected/d84e8df2-e3d9-4703-9722-c7c0eb22fb07-kube-api-access-bl62c\") pod \"perf-node-gather-daemonset-dkl9b\" (UID: \"d84e8df2-e3d9-4703-9722-c7c0eb22fb07\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"
Apr 23 09:44:41.289123 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.289097 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w7jwq_208a4652-fd62-4c8b-b1b9-542601fb566c/dns/0.log"
Apr 23 09:44:41.307714 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.307677 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w7jwq_208a4652-fd62-4c8b-b1b9-542601fb566c/kube-rbac-proxy/0.log"
Apr 23 09:44:41.367982 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.367902 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xvcqj_bbce7cd3-954b-41fd-b6c9-9f47fee30477/dns-node-resolver/0.log"
Apr 23 09:44:41.416018 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.415985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"
Apr 23 09:44:41.541358 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.541324 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"]
Apr 23 09:44:41.544470 ip-10-0-129-154 kubenswrapper[2575]: W0423 09:44:41.544442 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd84e8df2_e3d9_4703_9722_c7c0eb22fb07.slice/crio-997fc301b01ed1d795159ad54adbc4e8b463cf4da26a1a2e2b52b70f4c99a1ad WatchSource:0}: Error finding container 997fc301b01ed1d795159ad54adbc4e8b463cf4da26a1a2e2b52b70f4c99a1ad: Status 404 returned error can't find the container with id 997fc301b01ed1d795159ad54adbc4e8b463cf4da26a1a2e2b52b70f4c99a1ad
Apr 23 09:44:41.741468 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.741441 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-64d8c47674-2l6rs_f9dd5ac4-f74f-439e-8fb8-9f280d70427b/registry/0.log"
Apr 23 09:44:41.805013 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:41.804979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qbs24_7f30d2db-f0de-4938-9291-99d089bb41d8/node-ca/0.log"
Apr 23 09:44:42.042036 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:42.041994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" event={"ID":"d84e8df2-e3d9-4703-9722-c7c0eb22fb07","Type":"ContainerStarted","Data":"ac85144e0b1d31159f26b629edf9aa0c0a94cfc09fb51513b6d635b7dbdf66a8"}
Apr 23 09:44:42.042241 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:42.042041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" event={"ID":"d84e8df2-e3d9-4703-9722-c7c0eb22fb07","Type":"ContainerStarted","Data":"997fc301b01ed1d795159ad54adbc4e8b463cf4da26a1a2e2b52b70f4c99a1ad"}
Apr 23 09:44:42.042241 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:42.042063 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"
Apr 23 09:44:42.059549 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:42.059499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b" podStartSLOduration=1.05948465 podStartE2EDuration="1.05948465s" podCreationTimestamp="2026-04-23 09:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:44:42.05845711 +0000 UTC m=+835.321844259" watchObservedRunningTime="2026-04-23 09:44:42.05948465 +0000 UTC m=+835.322871800"
Apr 23 09:44:42.774761 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:42.774725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tlwj2_198951b6-14d0-4d27-82e3-20e88b58ddc3/serve-healthcheck-canary/0.log"
Apr 23 09:44:43.106080 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:43.106051 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fjgbq_301171ee-bdb8-44e3-a27c-edc3a9a811c3/kube-rbac-proxy/0.log"
Apr 23 09:44:43.124348 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:43.124322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fjgbq_301171ee-bdb8-44e3-a27c-edc3a9a811c3/exporter/0.log"
Apr 23 09:44:43.143618 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:43.143575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fjgbq_301171ee-bdb8-44e3-a27c-edc3a9a811c3/extractor/0.log"
Apr 23 09:44:44.829798 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:44.829759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-4vqrz_9cd7095f-96a9-47eb-a770-b67fdfe65502/jobset-operator/0.log"
Apr 23 09:44:47.743220 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:47.743118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lqdgp_d61a941a-d439-4bd9-ba1b-aa7f923ec15a/migrator/0.log"
Apr 23 09:44:47.760122 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:47.760098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lqdgp_d61a941a-d439-4bd9-ba1b-aa7f923ec15a/graceful-termination/0.log"
Apr 23 09:44:48.055366 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:48.055341 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-dkl9b"
Apr 23 09:44:48.917286 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:48.917256 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/kube-multus-additional-cni-plugins/0.log"
Apr 23 09:44:48.935773 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:48.935740 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/egress-router-binary-copy/0.log"
Apr 23 09:44:48.953789 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:48.953765 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/cni-plugins/0.log"
Apr 23 09:44:48.972647 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:48.972622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/bond-cni-plugin/0.log"
Apr 23 09:44:49.006633 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:49.006610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/routeoverride-cni/0.log"
Apr 23 09:44:49.060741 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:49.060711 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/whereabouts-cni-bincopy/0.log"
Apr 23 09:44:49.079908 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:49.079877 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvdjb_d3bd2bf5-e581-45d4-b978-acec1bd86ea3/whereabouts-cni/0.log"
Apr 23 09:44:49.275912 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:49.275829 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rbs48_7b5dd615-d19f-43a8-9c21-49d3b2cb8bcc/kube-multus/0.log"
Apr 23 09:44:49.370684 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:49.370654 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nfwtj_9675e92f-c255-4d0a-a137-2fb828720d4d/network-metrics-daemon/0.log"
Apr 23 09:44:49.388368 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:49.388340 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nfwtj_9675e92f-c255-4d0a-a137-2fb828720d4d/kube-rbac-proxy/0.log"
Apr 23 09:44:50.852666 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.852627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/ovn-controller/0.log"
Apr 23 09:44:50.877301 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.877254 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/ovn-acl-logging/0.log"
Apr 23 09:44:50.895797 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.895768 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/kube-rbac-proxy-node/0.log"
Apr 23 09:44:50.916278 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.916233 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 09:44:50.935935 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.935887 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/northd/0.log"
Apr 23 09:44:50.954687 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.954653 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/nbdb/0.log"
Apr 23 09:44:50.976746 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:50.976710 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/sbdb/0.log"
Apr 23 09:44:51.139277 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:51.139129 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7lqz_c2afc719-c33e-48f6-bacc-09ccb439d0fb/ovnkube-controller/0.log"
Apr 23 09:44:52.172670 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:52.172635 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-t5mzg_83e2352e-7188-4529-a79a-11d59e36b30b/network-check-target-container/0.log"
Apr 23 09:44:52.963394 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:52.963370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-96h56_bea3c26f-67bd-4418-8a5c-830cf2936a15/iptables-alerter/0.log"
Apr 23 09:44:53.614856 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:53.614824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xkbkt_4919ce05-148e-4367-8312-f7597a344990/tuned/0.log"
Apr 23 09:44:56.801355 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:56.801324 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-qh6hr_de2aeb89-80c4-49f3-bb15-19cbda495e58/csi-driver/0.log"
Apr 23 09:44:56.821656 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:56.821624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-qh6hr_de2aeb89-80c4-49f3-bb15-19cbda495e58/csi-node-driver-registrar/0.log"
Apr 23 09:44:56.837415 ip-10-0-129-154 kubenswrapper[2575]: I0423 09:44:56.837379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-qh6hr_de2aeb89-80c4-49f3-bb15-19cbda495e58/csi-liveness-probe/0.log"