Apr 22 19:54:53.304801 ip-10-0-128-160 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:54:53.304815 ip-10-0-128-160 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:54:53.304825 ip-10-0-128-160 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:54:53.305140 ip-10-0-128-160 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:55:03.471862 ip-10-0-128-160 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:55:03.471880 ip-10-0-128-160 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2b5da5ce4bbc44969157727b093b05dd --
Apr 22 19:58:40.694193 ip-10-0-128-160 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:58:41.106880 ip-10-0-128-160 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:41.106880 ip-10-0-128-160 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:58:41.106880 ip-10-0-128-160 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:41.106880 ip-10-0-128-160 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:58:41.106880 ip-10-0-128-160 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:41.107631 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.107548 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:58:41.109760 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109745 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:41.109760 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109761 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109765 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109768 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109771 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109775 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109778 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109787 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109790 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109793 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109795 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109798 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109800 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109803 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109805 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109808 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109811 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109814 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109816 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109819 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109821 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:41.109822 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109824 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109829 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109833 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109836 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109839 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109842 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109845 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109847 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109850 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109852 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109855 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109857 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109860 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109862 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109865 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109867 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109870 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109873 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109877 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109881 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:41.110310 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109885 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109887 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109890 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109892 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109895 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109897 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109900 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109902 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109904 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109907 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109910 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109926 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109929 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109932 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109935 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109939 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109942 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109944 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109947 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:41.110794 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109949 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109952 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109955 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109958 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109960 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109963 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109966 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109968 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109971 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109973 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109976 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109978 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109981 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109985 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109987 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109990 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109992 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.109996 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110000 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110003 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:41.111276 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110006 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110008 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110011 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110013 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110016 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110018 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110387 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110394 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110399 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110403 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110407 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110410 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110413 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110416 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110419 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110422 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110424 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110427 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110430 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:41.111754 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110432 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110435 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110438 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110440 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110443 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110445 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110448 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110450 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110453 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110455 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110457 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110460 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110463 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110467 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110470 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110473 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110475 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110478 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110480 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:41.112265 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110483 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110487 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110489 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110492 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110495 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110497 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110500 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110503 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110505 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110508 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110510 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110513 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110515 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110523 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110526 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110529 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110531 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110534 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110536 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110538 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:41.112739 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110541 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110543 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110546 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110548 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110550 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110553 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110555 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110558 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110560 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110563 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110565 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110567 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110570 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110573 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110576 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110578 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110581 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110583 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110586 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110589 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:41.113250 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110591 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110593 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110596 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110598 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110601 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110603 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110611 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110614 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110617 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110620 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110622 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110625 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110627 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.110630 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110703 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110710 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110722 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110727 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110731 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110734 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110738 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:58:41.113742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110742 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110745 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110748 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110752 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110756 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110759 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110762 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110765 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110767 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110770 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110773 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110776 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110779 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110782 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110785 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110788 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110791 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110795 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110799 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110802 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110805 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110808 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110811 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110814 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110817 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:58:41.114262 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110820 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110824 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110827 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110830 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110832 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110835 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110838 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110843 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110846 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110849 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110853 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110856 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110860 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110863 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110866 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110869 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110872 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110874 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110877 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110880 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110883 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110886 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110888 2575 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110892 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110895 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:58:41.114873 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110898 2575
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110901 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110904 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110907 2575 flags.go:64] FLAG: --help="false" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110910 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110926 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110929 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110932 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110936 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110939 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110942 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110945 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110948 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110951 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:58:41.115498 ip-10-0-128-160 
kubenswrapper[2575]: I0422 19:58:41.110953 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110957 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110959 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110962 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110965 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110968 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110971 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110974 2575 flags.go:64] FLAG: --lock-file="" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110977 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110980 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:58:41.115498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110983 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110988 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110991 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110994 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110997 2575 flags.go:64] FLAG: 
--logging-format="text" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.110999 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111003 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111005 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111008 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111012 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111016 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111020 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111023 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111026 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111029 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111032 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111034 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111037 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111040 2575 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111047 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111050 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111053 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111056 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:58:41.116110 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111059 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111065 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111067 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111070 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111073 2575 flags.go:64] FLAG: --port="10250" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111077 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111080 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0883abc7036c63b9d" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111085 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111088 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111091 
2575 flags.go:64] FLAG: --register-node="true" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111094 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111097 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111101 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111104 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111107 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111109 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111113 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111116 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111119 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111122 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111131 2575 flags.go:64] FLAG: --runonce="false" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111134 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111138 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111141 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 
19:58:41.111144 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111146 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:58:41.116788 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111149 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111152 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111155 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111158 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111161 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111164 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111167 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111170 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111173 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111175 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111188 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111191 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111194 2575 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111199 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111202 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111205 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111208 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111211 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111214 2575 flags.go:64] FLAG: --v="2" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111218 2575 flags.go:64] FLAG: --version="false" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111222 2575 flags.go:64] FLAG: --vmodule="" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111227 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.111230 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111324 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111328 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:58:41.117466 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111331 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111333 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 
19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111339 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111342 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111345 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111348 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111350 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111353 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111355 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111358 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111360 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111363 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111365 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111368 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 
19:58:41.111371 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111373 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111376 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111381 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111384 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111386 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:58:41.118106 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111390 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111393 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111396 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111398 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111401 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111403 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111406 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 
19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111409 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111411 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111414 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111416 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111418 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111421 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111423 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111426 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111432 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111435 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111437 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111440 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111442 2575 
feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:58:41.118599 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111445 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111447 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111450 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111452 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111455 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111458 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111460 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111463 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111466 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111469 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111472 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111474 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111478 2575 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111481 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111483 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111486 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111488 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111490 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111494 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111498 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:58:41.119091 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111501 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111503 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111506 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111508 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111511 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111513 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111515 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111518 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111522 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111526 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111529 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111532 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111534 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111537 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111540 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111543 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111545 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111548 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111551 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:41.119573 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111553 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111556 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111561 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111563 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.111566 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.112281 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.118192 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.118207 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118253 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118258 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118262 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118265 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118268 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118271 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118274 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:41.120069 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118277 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118279 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118282 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118285 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118288 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118291 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118293 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118296 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118299 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118302 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118304 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118307 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118309 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118312 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118315 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118317 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118321 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118325 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118328 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118330 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:41.120456 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118333 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118335 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118338 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118340 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118344 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118347 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118350 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118352 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118355 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118357 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118361 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118365 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118368 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118371 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118374 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118377 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118379 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118382 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118385 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:41.121050 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118388 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118390 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118393 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118396 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118398 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118401 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118404 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118407 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118409 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118412 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118414 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118417 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118420 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118422 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118425 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118427 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118430 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118433 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118436 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118439 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:41.121545 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118441 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118444 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118447 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118449 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118452 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118454 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118457 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118459 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118461 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118464 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118467 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118469 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118472 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118474 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118477 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118480 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118482 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118485 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118487 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:41.122046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118490 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.118495 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118588 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118593 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118596 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118599 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118602 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118605 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118608 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118611 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118614 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118617 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118620 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118623 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118626 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:41.122558 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118629 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118631 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118634 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118636 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118639 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118642 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118644 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118647 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118649 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118652 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118654 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118657 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118659 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118663 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118666 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118669 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118672 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118674 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118677 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118680 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:41.122949 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118682 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118685 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118688 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118690 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118693 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118696 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118698 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118701 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118704 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118706 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118709 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118712 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118714 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118717 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118720 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118722 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118725 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118727 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118730 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118732 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:41.123444 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118735 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118737 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118739 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118742 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118745 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118749 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118753 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118755 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118758 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118761 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118764 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118766 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118769 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118771 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118774 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118776 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118779 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118781 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118784 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:41.123931 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118786 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118789 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118791 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118794 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118799 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118801 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118804 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118806 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118809 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118811 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118814 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118816 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118819 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:41.118821 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.118826 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:58:41.124413 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.119520 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:58:41.124824 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.122089 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:58:41.124824 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.122891 2575 server.go:1019] "Starting client certificate rotation"
Apr 22 19:58:41.124824 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.122982 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:58:41.124824 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.123733 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:58:41.148436 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.148417 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:58:41.152013 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.151993 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:58:41.163300 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.163281 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:58:41.169422 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.169407 2575 log.go:25] "Validated CRI v1 image API"
Apr 22 19:58:41.173434 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.173418 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:58:41.176394 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.176375 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:58:41.181261 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.181243 2575 fs.go:135] Filesystem UUIDs: map[05eea702-6558-4019-b864-1c65773fff28:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fbce74e5-e9ec-4f36-b142-b15c511be0a7:/dev/nvme0n1p3]
Apr 22 19:58:41.181316 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.181262 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:58:41.186553 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.186445 2575 manager.go:217] Machine: {Timestamp:2026-04-22 19:58:41.184749054 +0000 UTC m=+0.382069931 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107900 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2685e9ac5e017558ea6b891dbd1a2c SystemUUID:ec2685e9-ac5e-0175-58ea-6b891dbd1a2c BootID:2b5da5ce-4bbc-4496-9157-727b093b05dd Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6d:41:08:bb:1f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6d:41:08:bb:1f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:52:b6:b1:25:71 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:58:41.186553 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.186551 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:58:41.186658 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.186644 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:58:41.187636 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.187616 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:58:41.187775 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.187639 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-160.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:58:41.187819 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.187784 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:58:41.187819 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.187792 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:58:41.187819 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.187805
2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:58:41.188524 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.188512 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:58:41.189352 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.189342 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:41.189457 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.189448 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:58:41.191790 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.191780 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:58:41.191833 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.191794 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:58:41.191833 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.191809 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:58:41.191833 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.191818 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:58:41.191833 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.191825 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:58:41.192709 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.192697 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:58:41.192746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.192716 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:58:41.195562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.195535 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9jdbn" Apr 22 19:58:41.195648 ip-10-0-128-160 
kubenswrapper[2575]: I0422 19:58:41.195571 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:58:41.197156 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.197136 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:58:41.198641 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198630 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198646 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198652 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198658 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198664 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198669 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198675 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198682 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:58:41.198685 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198689 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:58:41.199025 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198695 2575 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:58:41.199025 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198709 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:58:41.199025 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.198718 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:58:41.199495 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.199484 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:58:41.199524 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.199495 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:58:41.200741 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.200725 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9jdbn" Apr 22 19:58:41.202949 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.202931 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-160.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:58:41.203219 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.203206 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:58:41.203281 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.203240 2575 server.go:1295] "Started kubelet" Apr 22 19:58:41.203548 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.203516 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:58:41.203584 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.203560 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:58:41.203716 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.203701 2575 reflector.go:200] "Failed to watch" err="failed 
to list *v1.Node: nodes \"ip-10-0-128-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:58:41.203716 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.203700 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:58:41.203821 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.203752 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:58:41.204011 ip-10-0-128-160 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:58:41.204876 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.204824 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:58:41.205665 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.205648 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:58:41.208568 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.208445 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:58:41.209018 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.208991 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:58:41.209645 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209624 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:58:41.209728 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209704 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:58:41.209728 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209723 2575 factory.go:55] Registering systemd factory Apr 22 
19:58:41.209823 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209739 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:58:41.209823 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209625 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:58:41.209823 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209814 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:58:41.209823 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209825 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:58:41.210014 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209963 2575 factory.go:153] Registering CRI-O factory Apr 22 19:58:41.210014 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.209979 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 19:58:41.210104 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.210060 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:58:41.210104 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.210084 2575 factory.go:103] Registering Raw factory Apr 22 19:58:41.210104 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.210099 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 19:58:41.210311 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.210287 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.210566 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.210554 2575 manager.go:319] Starting recovery of all containers Apr 22 19:58:41.211055 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.211025 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:41.221194 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.221171 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-160.ec2.internal\" not found" node="ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.225539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.225522 2575 manager.go:324] Recovery completed Apr 22 19:58:41.229376 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.229363 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:41.231557 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.231541 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:41.231623 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.231569 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:41.231623 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.231581 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:41.232088 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.232070 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:58:41.232088 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.232085 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:58:41.232224 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.232101 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:41.235140 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.235128 2575 policy_none.go:49] "None policy: Start" Apr 22 19:58:41.235204 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.235144 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 
19:58:41.235204 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.235153 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:58:41.264449 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264430 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 19:58:41.264526 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.264458 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:58:41.264526 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264470 2575 server.go:85] "Starting device plugin registration server" Apr 22 19:58:41.264720 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264703 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:58:41.264811 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264719 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:58:41.264864 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264833 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:58:41.264939 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264906 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:58:41.264939 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.264932 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:58:41.265553 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.265468 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 19:58:41.265553 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.265502 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.328728 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.328704 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:58:41.330796 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.329858 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:58:41.330796 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.329878 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:58:41.330796 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.329893 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:58:41.330796 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.329899 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:58:41.330796 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.329944 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:58:41.332149 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.332122 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:41.365056 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.365008 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:41.366016 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.365998 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:41.366103 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.366023 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:41.366103 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.366034 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:41.366103 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.366054 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.375564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.375549 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.375608 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.375572 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-160.ec2.internal\": node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.400732 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.400709 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.431027 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.431008 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal"] Apr 22 19:58:41.431077 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.431062 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:41.431821 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.431808 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:41.431892 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.431839 2575 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:41.431892 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.431854 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:41.433996 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.433981 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:41.434146 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.434133 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.434193 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.434158 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:41.434988 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.434972 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:41.435055 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.434998 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:41.435055 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.435007 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:41.435150 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.434973 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:41.435150 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.435084 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 22 19:58:41.435150 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.435100 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:41.437246 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.437229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.437322 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.437258 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:41.438127 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.438114 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:41.438206 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.438135 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:41.438206 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.438147 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:41.455462 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.455438 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-160.ec2.internal\" not found" node="ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.459632 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.459619 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-160.ec2.internal\" not found" node="ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.501283 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.501259 2575 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.511107 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.511087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31d5ae5fa3d2b4408ebb8dcb9abd1e84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal\" (UID: \"31d5ae5fa3d2b4408ebb8dcb9abd1e84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.511185 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.511110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c49858937a727d87766821f66c42c625-config\") pod \"kube-apiserver-proxy-ip-10-0-128-160.ec2.internal\" (UID: \"c49858937a727d87766821f66c42c625\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.511185 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.511127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/31d5ae5fa3d2b4408ebb8dcb9abd1e84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal\" (UID: \"31d5ae5fa3d2b4408ebb8dcb9abd1e84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.601501 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.601475 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.611797 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.611780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31d5ae5fa3d2b4408ebb8dcb9abd1e84-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal\" (UID: \"31d5ae5fa3d2b4408ebb8dcb9abd1e84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.611854 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.611804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c49858937a727d87766821f66c42c625-config\") pod \"kube-apiserver-proxy-ip-10-0-128-160.ec2.internal\" (UID: \"c49858937a727d87766821f66c42c625\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.611854 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.611829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/31d5ae5fa3d2b4408ebb8dcb9abd1e84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal\" (UID: \"31d5ae5fa3d2b4408ebb8dcb9abd1e84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.611940 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.611871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31d5ae5fa3d2b4408ebb8dcb9abd1e84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal\" (UID: \"31d5ae5fa3d2b4408ebb8dcb9abd1e84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.611940 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.611885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c49858937a727d87766821f66c42c625-config\") pod \"kube-apiserver-proxy-ip-10-0-128-160.ec2.internal\" (UID: \"c49858937a727d87766821f66c42c625\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.612018 ip-10-0-128-160 
kubenswrapper[2575]: I0422 19:58:41.611854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/31d5ae5fa3d2b4408ebb8dcb9abd1e84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal\" (UID: \"31d5ae5fa3d2b4408ebb8dcb9abd1e84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.702214 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.702167 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found" Apr 22 19:58:41.757662 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.757647 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" Apr 22 19:58:41.761973 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:41.761960 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal"
Apr 22 19:58:41.802374 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.802353 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:41.902899 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:41.902875 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.003362 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.003312 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.103862 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.103839 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.123250 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.123229 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:58:42.123862 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.123356 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:58:42.123862 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.123388 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:58:42.202776 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.202736 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:53:41 +0000 UTC" deadline="2027-11-05 21:34:01.699672865 +0000 UTC"
Apr 22 19:58:42.202776 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.202768 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13489h35m19.496908203s"
Apr 22 19:58:42.203979 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.203959 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.208968 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.208951 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:58:42.219876 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.219860 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:42.223708 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.223692 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:58:42.240987 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.240963 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lrjfp"
Apr 22 19:58:42.248295 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.248278 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lrjfp"
Apr 22 19:58:42.250119 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:42.250095 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d5ae5fa3d2b4408ebb8dcb9abd1e84.slice/crio-6bfd782505ddbd171424af338806fde8f604e1df3aa6cbff4d1b1daf64b970cf WatchSource:0}: Error finding container 6bfd782505ddbd171424af338806fde8f604e1df3aa6cbff4d1b1daf64b970cf: Status 404 returned error can't find the container with id 6bfd782505ddbd171424af338806fde8f604e1df3aa6cbff4d1b1daf64b970cf
Apr 22 19:58:42.250405 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:42.250385 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49858937a727d87766821f66c42c625.slice/crio-23d3b3e46ae15ce97e661524d26370fa2f21b058944fb48f98502cf944293356 WatchSource:0}: Error finding container 23d3b3e46ae15ce97e661524d26370fa2f21b058944fb48f98502cf944293356: Status 404 returned error can't find the container with id 23d3b3e46ae15ce97e661524d26370fa2f21b058944fb48f98502cf944293356
Apr 22 19:58:42.254224 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.254210 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:58:42.305008 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.304985 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.333148 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.333101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" event={"ID":"31d5ae5fa3d2b4408ebb8dcb9abd1e84","Type":"ContainerStarted","Data":"6bfd782505ddbd171424af338806fde8f604e1df3aa6cbff4d1b1daf64b970cf"}
Apr 22 19:58:42.333992 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.333974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" event={"ID":"c49858937a727d87766821f66c42c625","Type":"ContainerStarted","Data":"23d3b3e46ae15ce97e661524d26370fa2f21b058944fb48f98502cf944293356"}
Apr 22 19:58:42.405103 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.405085 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.505537 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.505493 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.606098 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.606074 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.706759 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:42.706731 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-160.ec2.internal\" not found"
Apr 22 19:58:42.785323 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.785109 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:42.809766 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.809736 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal"
Apr 22 19:58:42.823245 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.823221 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:58:42.824092 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.824071 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal"
Apr 22 19:58:42.831126 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:42.831106 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:58:43.193244 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.193176 2575 apiserver.go:52] "Watching apiserver"
Apr 22 19:58:43.202503 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.202481 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:58:43.204602 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.204546 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-52sgh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal","openshift-multus/multus-v7xx6","openshift-network-diagnostics/network-check-target-9tw8t","openshift-dns/node-resolver-hrntb","openshift-multus/multus-additional-cni-plugins-vn8pw","openshift-multus/network-metrics-daemon-rw25k","openshift-network-operator/iptables-alerter-5sz4q","openshift-ovn-kubernetes/ovnkube-node-jw84m","kube-system/konnectivity-agent-hjrqd","kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w","openshift-cluster-node-tuning-operator/tuned-8wmlw"]
Apr 22 19:58:43.209433 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.209411 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.209535 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.209417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:43.209761 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.209605 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:43.211747 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.211722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:43.211847 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.211791 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:43.212116 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.212093 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.212235 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.212191 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:58:43.212348 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.212330 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.212409 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.212374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:58:43.212461 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.212417 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rrllm\""
Apr 22 19:58:43.216476 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.216082 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.216476 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.216199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.218353 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.218325 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.218818 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.218799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.218900 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.218871 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:58:43.218982 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.218911 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.219039 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.218982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-netns\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219039 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7782d\""
Apr 22 19:58:43.219039 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-kubelet\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219185 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-hostroot\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219185 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-multus-certs\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219285 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:43.219285 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219207 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:58:43.219285 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec02ec42-2641-4c07-aa33-0277c20a77a7-cni-binary-copy\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219285 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219239 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wvvk9\""
Apr 22 19:58:43.219285 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-socket-dir-parent\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219285 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-cni-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-cni-bin\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-cni-multus\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-conf-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-daemon-config\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhkf\" (UniqueName: \"kubernetes.io/projected/ec02ec42-2641-4c07-aa33-0277c20a77a7-kube-api-access-8bhkf\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rgmm\" (UniqueName: \"kubernetes.io/projected/7c3e7957-917d-44e3-8833-76ccc4a5d167-kube-api-access-8rgmm\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-cnibin\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219564 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219553 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-os-release\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219953 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-k8s-cni-cncf-io\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219953 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-etc-kubernetes\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.219953 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:43.219953 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.219661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-system-cni-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.220526 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.220506 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:58:43.220661 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.220641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5sz4q"
Apr 22 19:58:43.222745 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.221882 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.222745 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.222022 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.222745 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.222284 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-smzgt\""
Apr 22 19:58:43.224179 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.223234 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.224179 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.223659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.224179 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.224068 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wnd2q\""
Apr 22 19:58:43.224364 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.224186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:58:43.225159 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.225137 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.228211 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228192 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hjrqd"
Apr 22 19:58:43.228563 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228538 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.228649 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rshfs\""
Apr 22 19:58:43.228709 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:58:43.228709 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228696 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:58:43.228825 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228806 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:58:43.228825 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228633 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:58:43.229004 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.228872 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.230338 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.230317 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:58:43.230846 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.230677 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:58:43.230846 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.230732 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p42gv\""
Apr 22 19:58:43.231007 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.230935 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.233247 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.233230 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.233320 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.233258 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.233599 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.233580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.233599 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.233597 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k7gmp\""
Apr 22 19:58:43.233731 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.233662 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:58:43.235536 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.235506 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:58:43.235807 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.235787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vmj5p\""
Apr 22 19:58:43.235935 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.235900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:58:43.249175 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.249153 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:42 +0000 UTC" deadline="2027-09-17 03:58:25.50675305 +0000 UTC"
Apr 22 19:58:43.249175 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.249173 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12295h59m42.257582535s"
Apr 22 19:58:43.310592 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.310572 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:58:43.320403 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwbdm\" (UniqueName: \"kubernetes.io/projected/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-kube-api-access-vwbdm\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.320403 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-os-release\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.320568 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfp7\" (UniqueName: \"kubernetes.io/projected/bd664ecc-6372-4249-9d3f-4ca9da5c429c-kube-api-access-mpfp7\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.320568 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-var-lib-kubelet\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.320568 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-cni-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.320568 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320553 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-hosts-file\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-etc-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cms5\" (UniqueName: \"kubernetes.io/projected/23005632-886a-45ba-ac15-5a0a1282248f-kube-api-access-2cms5\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0155db59-b48d-4919-97a6-8855d675375d-tmp\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-cni-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-os-release\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-etc-kubernetes\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-os-release\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.320756 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-kubelet\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-etc-kubernetes\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovn-node-metrics-cert\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-hostroot\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a09841fa-cd66-4311-9794-2efee23e5727-serviceca\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-hostroot\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-registration-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320942 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysconfig\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-lib-modules\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.320989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-run-netns\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-device-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-cni-multus\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.321093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cni-binary-copy\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321105 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-modprobe-d\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-k8s-cni-cncf-io\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c-agent-certs\") pod \"konnectivity-agent-hjrqd\" (UID: \"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c\") " pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-cni-multus\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-sys-fs\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321265 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-k8s-cni-cncf-io\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-run\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-system-cni-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.321439 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-multus-certs\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-system-cni-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.321530 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:43.821487461 +0000 UTC m=+3.018808329 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:43.321562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-multus-certs\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a09841fa-cd66-4311-9794-2efee23e5727-host\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-slash\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: 
I0422 19:58:43.321767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-env-overrides\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec02ec42-2641-4c07-aa33-0277c20a77a7-cni-binary-copy\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-ovn\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovnkube-script-lib\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.321973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-socket-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322000 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-kubernetes\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7fw\" (UniqueName: \"kubernetes.io/projected/0155db59-b48d-4919-97a6-8855d675375d-kube-api-access-bq7fw\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-daemon-config\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-system-cni-dir\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-cnibin\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cnibin\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.322266 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-systemd-units\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0155db59-b48d-4919-97a6-8855d675375d-etc-tuned\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322277 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-cnibin\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-netns\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-kubelet\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-kubelet\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-run-netns\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/c3fdb762-05cf-4828-8c3e-45c50aca2528-iptables-alerter-script\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c-konnectivity-ca\") pod \"konnectivity-agent-hjrqd\" (UID: \"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c\") " pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-run-ovn-kubernetes\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-cni-netd\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-systemd\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: 
I0422 19:58:43.322536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-socket-dir-parent\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3fdb762-05cf-4828-8c3e-45c50aca2528-host-slash\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-socket-dir-parent\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-etc-selinux\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: 
\"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.323023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysctl-d\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysctl-conf\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-cni-bin\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-conf-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhkf\" (UniqueName: 
\"kubernetes.io/projected/ec02ec42-2641-4c07-aa33-0277c20a77a7-kube-api-access-8bhkf\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-host-var-lib-cni-bin\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-node-log\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-conf-dir\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-log-socket\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-sys\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-host\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rgmm\" (UniqueName: \"kubernetes.io/projected/7c3e7957-917d-44e3-8833-76ccc4a5d167-kube-api-access-8rgmm\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65xt9\" (UniqueName: \"kubernetes.io/projected/c3fdb762-05cf-4828-8c3e-45c50aca2528-kube-api-access-65xt9\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 
19:58:43.322910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-systemd\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-var-lib-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.322991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-cni-bin\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.323672 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.323015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovnkube-config\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.324296 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.323022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec02ec42-2641-4c07-aa33-0277c20a77a7-cni-binary-copy\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6" 
Apr 22 19:58:43.324296 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.323040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4xsj\" (UniqueName: \"kubernetes.io/projected/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-kube-api-access-d4xsj\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.324296 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.323063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbdzw\" (UniqueName: \"kubernetes.io/projected/a09841fa-cd66-4311-9794-2efee23e5727-kube-api-access-nbdzw\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.324296 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.323079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ec02ec42-2641-4c07-aa33-0277c20a77a7-multus-daemon-config\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.324296 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.323089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-tmp-dir\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.329894 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.329872 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:43.330017 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.329902 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:43.330017 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.329940 2575 projected.go:194] Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:43.330017 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.330001 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:43.829985104 +0000 UTC m=+3.027306001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:43.330496 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.330477 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:58:43.333662 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.333637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhkf\" (UniqueName: \"kubernetes.io/projected/ec02ec42-2641-4c07-aa33-0277c20a77a7-kube-api-access-8bhkf\") pod \"multus-v7xx6\" (UID: \"ec02ec42-2641-4c07-aa33-0277c20a77a7\") " pod="openshift-multus/multus-v7xx6"
Apr 22 19:58:43.333744 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.333681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rgmm\" (UniqueName: \"kubernetes.io/projected/7c3e7957-917d-44e3-8833-76ccc4a5d167-kube-api-access-8rgmm\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:43.339474 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.339456 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:43.424295 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cnibin\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.424295 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-systemd-units\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0155db59-b48d-4919-97a6-8855d675375d-etc-tuned\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c3fdb762-05cf-4828-8c3e-45c50aca2528-iptables-alerter-script\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cnibin\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c-konnectivity-ca\") pod \"konnectivity-agent-hjrqd\" (UID: \"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c\") " pod="kube-system/konnectivity-agent-hjrqd"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-systemd-units\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-run-ovn-kubernetes\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-cni-netd\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424461 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-systemd\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3fdb762-05cf-4828-8c3e-45c50aca2528-host-slash\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q"
Apr 22 19:58:43.424509 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-run-ovn-kubernetes\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-cni-netd\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3fdb762-05cf-4828-8c3e-45c50aca2528-host-slash\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-systemd\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-etc-selinux\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysctl-d\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysctl-conf\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-node-log\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-log-socket\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-sys\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysctl-d\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-host\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65xt9\" (UniqueName: \"kubernetes.io/projected/c3fdb762-05cf-4828-8c3e-45c50aca2528-kube-api-access-65xt9\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-systemd\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-var-lib-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.424986 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c-konnectivity-ca\") pod \"konnectivity-agent-hjrqd\" (UID: \"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c\") " pod="kube-system/konnectivity-agent-hjrqd"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-cni-bin\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c3fdb762-05cf-4828-8c3e-45c50aca2528-iptables-alerter-script\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-etc-selinux\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovnkube-config\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysctl-conf\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4xsj\" (UniqueName: \"kubernetes.io/projected/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-kube-api-access-d4xsj\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.424999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-node-log\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-systemd\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-host\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbdzw\" (UniqueName: \"kubernetes.io/projected/a09841fa-cd66-4311-9794-2efee23e5727-kube-api-access-nbdzw\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-tmp-dir\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwbdm\" (UniqueName: \"kubernetes.io/projected/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-kube-api-access-vwbdm\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-os-release\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfp7\" (UniqueName: \"kubernetes.io/projected/bd664ecc-6372-4249-9d3f-4ca9da5c429c-kube-api-access-mpfp7\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-log-socket\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.425746 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-var-lib-kubelet\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-os-release\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-hosts-file\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-hosts-file\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-sys\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-etc-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cms5\" (UniqueName: \"kubernetes.io/projected/23005632-886a-45ba-ac15-5a0a1282248f-kube-api-access-2cms5\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425449 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-cni-bin\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0155db59-b48d-4919-97a6-8855d675375d-tmp\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-var-lib-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-var-lib-kubelet\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-kubelet\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-kubelet\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-etc-openvswitch\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovn-node-metrics-cert\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.426501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a09841fa-cd66-4311-9794-2efee23e5727-serviceca\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-registration-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysconfig\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-registration-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-lib-modules\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.425989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-run-netns\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-device-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cni-binary-copy\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-modprobe-d\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c-agent-certs\") pod \"konnectivity-agent-hjrqd\" (UID: \"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c\") " pod="kube-system/konnectivity-agent-hjrqd"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-sys-fs\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-run\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-sys-fs\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovnkube-config\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a09841fa-cd66-4311-9794-2efee23e5727-serviceca\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a09841fa-cd66-4311-9794-2efee23e5727-host\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh"
Apr 22 19:58:43.427317 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-sysconfig\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-run\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw"
Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw"
Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-slash\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-env-overrides\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName:
\"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-run-netns\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-lib-modules\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-slash\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-modprobe-d\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-device-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-ovn\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovnkube-script-lib\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.426995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd664ecc-6372-4249-9d3f-4ca9da5c429c-cni-binary-copy\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427007 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-socket-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-kubernetes\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.427997 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7fw\" (UniqueName: \"kubernetes.io/projected/0155db59-b48d-4919-97a6-8855d675375d-kube-api-access-bq7fw\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-env-overrides\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0155db59-b48d-4919-97a6-8855d675375d-etc-kubernetes\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427207 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-system-cni-dir\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-run-ovn\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-tmp-dir\") pod \"node-resolver-hrntb\" (UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd664ecc-6372-4249-9d3f-4ca9da5c429c-system-cni-dir\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a09841fa-cd66-4311-9794-2efee23e5727-host\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/23005632-886a-45ba-ac15-5a0a1282248f-socket-dir\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0155db59-b48d-4919-97a6-8855d675375d-tmp\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0155db59-b48d-4919-97a6-8855d675375d-etc-tuned\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.427974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovnkube-script-lib\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.428679 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.428330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-ovn-node-metrics-cert\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.429215 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.428905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c-agent-certs\") pod \"konnectivity-agent-hjrqd\" (UID: \"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c\") " pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:58:43.433938 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.433899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfp7\" (UniqueName: \"kubernetes.io/projected/bd664ecc-6372-4249-9d3f-4ca9da5c429c-kube-api-access-mpfp7\") pod \"multus-additional-cni-plugins-vn8pw\" (UID: \"bd664ecc-6372-4249-9d3f-4ca9da5c429c\") " pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.434467 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.434422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwbdm\" (UniqueName: \"kubernetes.io/projected/7047d9ee-3de6-4d53-a8d2-f3e0ea19b774-kube-api-access-vwbdm\") pod \"node-resolver-hrntb\" 
(UID: \"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774\") " pod="openshift-dns/node-resolver-hrntb" Apr 22 19:58:43.434790 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.434751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbdzw\" (UniqueName: \"kubernetes.io/projected/a09841fa-cd66-4311-9794-2efee23e5727-kube-api-access-nbdzw\") pod \"node-ca-52sgh\" (UID: \"a09841fa-cd66-4311-9794-2efee23e5727\") " pod="openshift-image-registry/node-ca-52sgh" Apr 22 19:58:43.434992 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.434969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4xsj\" (UniqueName: \"kubernetes.io/projected/8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed-kube-api-access-d4xsj\") pod \"ovnkube-node-jw84m\" (UID: \"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.435075 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.435025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65xt9\" (UniqueName: \"kubernetes.io/projected/c3fdb762-05cf-4828-8c3e-45c50aca2528-kube-api-access-65xt9\") pod \"iptables-alerter-5sz4q\" (UID: \"c3fdb762-05cf-4828-8c3e-45c50aca2528\") " pod="openshift-network-operator/iptables-alerter-5sz4q" Apr 22 19:58:43.435969 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.435944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cms5\" (UniqueName: \"kubernetes.io/projected/23005632-886a-45ba-ac15-5a0a1282248f-kube-api-access-2cms5\") pod \"aws-ebs-csi-driver-node-5x25w\" (UID: \"23005632-886a-45ba-ac15-5a0a1282248f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.436049 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.436004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7fw\" (UniqueName: 
\"kubernetes.io/projected/0155db59-b48d-4919-97a6-8855d675375d-kube-api-access-bq7fw\") pod \"tuned-8wmlw\" (UID: \"0155db59-b48d-4919-97a6-8855d675375d\") " pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.456141 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.456088 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:43.520656 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.520632 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v7xx6" Apr 22 19:58:43.529350 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.529331 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hrntb" Apr 22 19:58:43.537858 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.537843 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" Apr 22 19:58:43.545400 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.545379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-52sgh" Apr 22 19:58:43.549880 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.549862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5sz4q" Apr 22 19:58:43.555532 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.555514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:58:43.560841 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.560822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:58:43.566784 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.566765 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" Apr 22 19:58:43.571332 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.571317 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" Apr 22 19:58:43.830301 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.830273 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:58:43.830401 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:43.830308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:58:43.830441 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.830409 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:43.830441 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.830410 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:43.830441 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.830420 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:43.830441 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.830431 2575 projected.go:194] 
Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:43.830560 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.830470 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:44.83045195 +0000 UTC m=+4.027772827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:43.830560 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:43.830486 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:44.830478774 +0000 UTC m=+4.027799634 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:43.836322 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.836299 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc4cd36_7e91_452b_b1f8_8c4ebd9f39ed.slice/crio-0d05c7dacec8f80e22247bac7897438e3455f129d4a7114c40d8a7d0b3ae5b80 WatchSource:0}: Error finding container 0d05c7dacec8f80e22247bac7897438e3455f129d4a7114c40d8a7d0b3ae5b80: Status 404 returned error can't find the container with id 0d05c7dacec8f80e22247bac7897438e3455f129d4a7114c40d8a7d0b3ae5b80 Apr 22 19:58:43.838192 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.838000 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec02ec42_2641_4c07_aa33_0277c20a77a7.slice/crio-6380e2fa4f72b48d8cde4707d2cee24487fedb84c6eebef493e4a0449e5d054b WatchSource:0}: Error finding container 6380e2fa4f72b48d8cde4707d2cee24487fedb84c6eebef493e4a0449e5d054b: Status 404 returned error can't find the container with id 6380e2fa4f72b48d8cde4707d2cee24487fedb84c6eebef493e4a0449e5d054b Apr 22 19:58:43.841016 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.840954 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd664ecc_6372_4249_9d3f_4ca9da5c429c.slice/crio-59ee7dc24863331468e86b3b824c324b3bc5b9cca9417c75a2ae426c63d59359 WatchSource:0}: Error finding container 59ee7dc24863331468e86b3b824c324b3bc5b9cca9417c75a2ae426c63d59359: Status 404 returned error can't find the 
container with id 59ee7dc24863331468e86b3b824c324b3bc5b9cca9417c75a2ae426c63d59359 Apr 22 19:58:43.841978 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.841958 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09841fa_cd66_4311_9794_2efee23e5727.slice/crio-4ea8edb04a4b3593664efce129aa982acc996307323436cd78f859acde36f93a WatchSource:0}: Error finding container 4ea8edb04a4b3593664efce129aa982acc996307323436cd78f859acde36f93a: Status 404 returned error can't find the container with id 4ea8edb04a4b3593664efce129aa982acc996307323436cd78f859acde36f93a Apr 22 19:58:43.842546 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.842502 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ea8eea5_b4b4_40f6_a297_6b5a87a93a8c.slice/crio-56b17dd45b74a44937fb01764f11f20c5585812d55063c61cb701e2b3cfb677e WatchSource:0}: Error finding container 56b17dd45b74a44937fb01764f11f20c5585812d55063c61cb701e2b3cfb677e: Status 404 returned error can't find the container with id 56b17dd45b74a44937fb01764f11f20c5585812d55063c61cb701e2b3cfb677e Apr 22 19:58:43.843407 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.843337 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0155db59_b48d_4919_97a6_8855d675375d.slice/crio-8363e0075731374af8d22fd0d17bb9133571b3de9063f6a5c7a123ec7730b415 WatchSource:0}: Error finding container 8363e0075731374af8d22fd0d17bb9133571b3de9063f6a5c7a123ec7730b415: Status 404 returned error can't find the container with id 8363e0075731374af8d22fd0d17bb9133571b3de9063f6a5c7a123ec7730b415 Apr 22 19:58:43.844351 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.844237 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fdb762_05cf_4828_8c3e_45c50aca2528.slice/crio-62578de7e3b7bfda0b53a371d09a4381a47ef87f60aa2c79fcffff727ac2aee2 WatchSource:0}: Error finding container 62578de7e3b7bfda0b53a371d09a4381a47ef87f60aa2c79fcffff727ac2aee2: Status 404 returned error can't find the container with id 62578de7e3b7bfda0b53a371d09a4381a47ef87f60aa2c79fcffff727ac2aee2 Apr 22 19:58:43.845475 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.845454 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23005632_886a_45ba_ac15_5a0a1282248f.slice/crio-6559bb25b09c6fe902dfa2a6d17e3430a0cb0c3f32e326f04b21f61b5b7aff29 WatchSource:0}: Error finding container 6559bb25b09c6fe902dfa2a6d17e3430a0cb0c3f32e326f04b21f61b5b7aff29: Status 404 returned error can't find the container with id 6559bb25b09c6fe902dfa2a6d17e3430a0cb0c3f32e326f04b21f61b5b7aff29 Apr 22 19:58:43.845981 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:58:43.845885 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7047d9ee_3de6_4d53_a8d2_f3e0ea19b774.slice/crio-9120cc7357790920b93fd690f6bd93d40f0bae9996cd893204498fdaa860b82c WatchSource:0}: Error finding container 9120cc7357790920b93fd690f6bd93d40f0bae9996cd893204498fdaa860b82c: Status 404 returned error can't find the container with id 9120cc7357790920b93fd690f6bd93d40f0bae9996cd893204498fdaa860b82c Apr 22 19:58:44.250469 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.250313 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:42 +0000 UTC" deadline="2028-02-08 19:09:09.890988596 +0000 UTC" Apr 22 19:58:44.250469 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.250349 2575 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15767h10m25.640643086s"
Apr 22 19:58:44.331313 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.330676 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:44.331313 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.330814 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:44.355975 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.355895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v7xx6" event={"ID":"ec02ec42-2641-4c07-aa33-0277c20a77a7","Type":"ContainerStarted","Data":"6380e2fa4f72b48d8cde4707d2cee24487fedb84c6eebef493e4a0449e5d054b"}
Apr 22 19:58:44.357816 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.357746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"0d05c7dacec8f80e22247bac7897438e3455f129d4a7114c40d8a7d0b3ae5b80"}
Apr 22 19:58:44.369790 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.369752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hrntb" event={"ID":"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774","Type":"ContainerStarted","Data":"9120cc7357790920b93fd690f6bd93d40f0bae9996cd893204498fdaa860b82c"}
Apr 22 19:58:44.380814 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.380777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" event={"ID":"23005632-886a-45ba-ac15-5a0a1282248f","Type":"ContainerStarted","Data":"6559bb25b09c6fe902dfa2a6d17e3430a0cb0c3f32e326f04b21f61b5b7aff29"}
Apr 22 19:58:44.390126 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.390097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" event={"ID":"0155db59-b48d-4919-97a6-8855d675375d","Type":"ContainerStarted","Data":"8363e0075731374af8d22fd0d17bb9133571b3de9063f6a5c7a123ec7730b415"}
Apr 22 19:58:44.394857 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.394828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerStarted","Data":"59ee7dc24863331468e86b3b824c324b3bc5b9cca9417c75a2ae426c63d59359"}
Apr 22 19:58:44.407486 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.407458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" event={"ID":"c49858937a727d87766821f66c42c625","Type":"ContainerStarted","Data":"9cd53e32347f9a8767b4c3e770287f5ca6bfbf179470c8939b286e2de905e8c1"}
Apr 22 19:58:44.411803 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.411743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5sz4q" event={"ID":"c3fdb762-05cf-4828-8c3e-45c50aca2528","Type":"ContainerStarted","Data":"62578de7e3b7bfda0b53a371d09a4381a47ef87f60aa2c79fcffff727ac2aee2"}
Apr 22 19:58:44.413023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.412970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hjrqd" event={"ID":"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c","Type":"ContainerStarted","Data":"56b17dd45b74a44937fb01764f11f20c5585812d55063c61cb701e2b3cfb677e"}
Apr 22 19:58:44.415549 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.415511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-52sgh" event={"ID":"a09841fa-cd66-4311-9794-2efee23e5727","Type":"ContainerStarted","Data":"4ea8edb04a4b3593664efce129aa982acc996307323436cd78f859acde36f93a"}
Apr 22 19:58:44.422431 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.422317 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-160.ec2.internal" podStartSLOduration=2.422303758 podStartE2EDuration="2.422303758s" podCreationTimestamp="2026-04-22 19:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:44.421413212 +0000 UTC m=+3.618734097" watchObservedRunningTime="2026-04-22 19:58:44.422303758 +0000 UTC m=+3.619624644"
Apr 22 19:58:44.841534 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.841454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:44.841534 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:44.841505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:44.841778 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.841655 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:44.841778 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.841675 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:44.841778 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.841688 2575 projected.go:194] Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:44.841778 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.841742 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:46.841724107 +0000 UTC m=+6.039044989 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:44.842187 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.842152 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:44.842290 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:44.842219 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:46.842205331 +0000 UTC m=+6.039526193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:45.333059 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:45.332988 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:45.333518 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:45.333110 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:45.430541 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:45.430506 2575 generic.go:358] "Generic (PLEG): container finished" podID="31d5ae5fa3d2b4408ebb8dcb9abd1e84" containerID="d5bb0ec05326a1a619d3e4c5c7e69af5880b4120557ce18ffd00f03ef4fe8dc8" exitCode=0
Apr 22 19:58:45.431402 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:45.431373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" event={"ID":"31d5ae5fa3d2b4408ebb8dcb9abd1e84","Type":"ContainerDied","Data":"d5bb0ec05326a1a619d3e4c5c7e69af5880b4120557ce18ffd00f03ef4fe8dc8"}
Apr 22 19:58:46.331101 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:46.331071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:46.331297 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.331213 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:46.440442 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:46.439697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" event={"ID":"31d5ae5fa3d2b4408ebb8dcb9abd1e84","Type":"ContainerStarted","Data":"fcf0f5eee3d9c031f9871499233302f1908daab111cdf2253f92848a2b8e0334"}
Apr 22 19:58:46.856032 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:46.855994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:46.856223 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:46.856045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:46.856223 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.856167 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:46.856346 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.856244 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.856222807 +0000 UTC m=+10.053543692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:46.856346 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.856177 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:46.856346 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.856271 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:46.856346 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.856286 2575 projected.go:194] Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:46.856346 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:46.856335 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.856320086 +0000 UTC m=+10.053640949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:47.331232 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:47.331155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:47.331381 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:47.331288 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:48.330305 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:48.330270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:48.330795 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:48.330422 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:49.330277 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:49.330245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:49.330434 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:49.330374 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:50.331063 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:50.331033 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:50.331476 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.331172 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:50.890670 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:50.890632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:50.890829 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:50.890690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:50.890932 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.890891 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:50.890932 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.890912 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:50.891046 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.890945 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:50.891046 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.890958 2575 projected.go:194] Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:50.891046 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.890989 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:58.890969187 +0000 UTC m=+18.088290062 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:50.891046 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:50.891020 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:58.891004848 +0000 UTC m=+18.088325716 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:51.331115 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:51.331032 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:51.331532 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:51.331148 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:52.330662 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:52.330616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:52.330842 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:52.330753 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:53.330265 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:53.330240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:53.330681 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:53.330364 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:54.330415 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:54.330382 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:54.330811 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:54.330510 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:55.330624 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:55.330588 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:55.331145 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:55.330704 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:56.330854 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:56.330818 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:56.331300 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:56.330967 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:57.330400 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:57.330370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:57.330570 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:57.330492 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:58:58.330104 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:58.330066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:58.330542 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.330187 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:58:58.952190 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:58.952155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:58:58.952378 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:58.952204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:58.952378 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.952307 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:58.952378 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.952377 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:14.952355307 +0000 UTC m=+34.149676168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:58.952548 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.952315 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:58.952548 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.952413 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:58.952548 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.952426 2575 projected.go:194] Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:58.952548 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:58.952471 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:59:14.952459279 +0000 UTC m=+34.149780154 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:59.330486 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:58:59.330404 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:58:59.330907 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:58:59.330516 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:59:00.330658 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:00.330625 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:59:00.330985 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:00.330727 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167"
Apr 22 19:59:01.331560 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.331408 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:01.332118 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:01.331623 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e"
Apr 22 19:59:01.465610 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.465397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hrntb" event={"ID":"7047d9ee-3de6-4d53-a8d2-f3e0ea19b774","Type":"ContainerStarted","Data":"b74543e52716f3bf69b87a1012383b23bc8ba2f0e51aec94d1ae03a9f288654f"}
Apr 22 19:59:01.466925 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.466893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" event={"ID":"23005632-886a-45ba-ac15-5a0a1282248f","Type":"ContainerStarted","Data":"f85d3db71366bcfbea6017c382608d409dead07caba37f9cd05244711e38e638"}
Apr 22 19:59:01.468311 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.468286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" event={"ID":"0155db59-b48d-4919-97a6-8855d675375d","Type":"ContainerStarted","Data":"5dd0b5dbc224034bab6f7b7c792a006397977e903de6c7e0886cd0cdb4b0d096"}
Apr 22 19:59:01.469765 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.469737 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd664ecc-6372-4249-9d3f-4ca9da5c429c" containerID="bc4be1659b1727c111dda01e4b4c60042539f45cf5bc35d77e636ec2601ed978" exitCode=0
Apr 22 19:59:01.469871 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.469807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerDied","Data":"bc4be1659b1727c111dda01e4b4c60042539f45cf5bc35d77e636ec2601ed978"}
Apr 22 19:59:01.471438 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.471416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hjrqd" event={"ID":"0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c","Type":"ContainerStarted","Data":"7c6b1a02ce5911b85be711173ffdf437479630a5b55e3a4d17637a4f08d35d0a"}
Apr 22 19:59:01.473039 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.473014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-52sgh" event={"ID":"a09841fa-cd66-4311-9794-2efee23e5727","Type":"ContainerStarted","Data":"b2bf5d442c3b53cb5ef9c6ab4831821ef3ca10b6fea9ccffa4f54667c9b8478e"}
Apr 22 19:59:01.474446 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.474418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v7xx6" event={"ID":"ec02ec42-2641-4c07-aa33-0277c20a77a7","Type":"ContainerStarted","Data":"3124d23bbd819f51e2de5264f8c567e2c376b66df7c66421618ff6ecc1a47021"}
Apr 22 19:59:01.476544 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.476524 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 19:59:01.476868 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.476836 2575 generic.go:358] "Generic (PLEG): container finished" podID="8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed" containerID="0b55553af85c9890508a5034ba749a9044ffe7764a799b6a5875b8b2df1ac13c" exitCode=1
Apr 22 19:59:01.476868 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.476864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"335ca4d273402096b527c91f333586f883b1be496ce68197f8144b26bbcc887a"}
Apr 22 19:59:01.477023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.476881 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"a162b830d401b773e5c0e59095b238616695553cddb4332e07d12c65ee9d5a90"}
Apr 22 19:59:01.477023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.476894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerDied","Data":"0b55553af85c9890508a5034ba749a9044ffe7764a799b6a5875b8b2df1ac13c"}
Apr 22 19:59:01.477023 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.476904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"e662392b18fc4fcc4de70c2125c17af453e474fed26f32caae098c900891a183"}
Apr 22 19:59:01.482762 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.482726 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hrntb" podStartSLOduration=3.547960908 podStartE2EDuration="20.482713879s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.848783733 +0000 UTC m=+3.046104597" lastFinishedPulling="2026-04-22 19:59:00.783536702 +0000 UTC m=+19.980857568" observedRunningTime="2026-04-22 19:59:01.482692875 +0000 UTC m=+20.680013769" watchObservedRunningTime="2026-04-22 19:59:01.482713879 +0000 UTC m=+20.680034766"
Apr 22 19:59:01.483411 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.483380 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-160.ec2.internal" podStartSLOduration=19.483369795 podStartE2EDuration="19.483369795s" podCreationTimestamp="2026-04-22 19:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:46.45889864 +0000 UTC m=+5.656219524" watchObservedRunningTime="2026-04-22 19:59:01.483369795 +0000 UTC m=+20.680690684"
Apr 22 19:59:01.499152 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.499095 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v7xx6" podStartSLOduration=3.541389089 podStartE2EDuration="20.499079948s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.839939419 +0000 UTC m=+3.037260283" lastFinishedPulling="2026-04-22 19:59:00.797630273 +0000 UTC m=+19.994951142" observedRunningTime="2026-04-22 19:59:01.498651181 +0000 UTC m=+20.695972066" watchObservedRunningTime="2026-04-22 19:59:01.499079948 +0000 UTC m=+20.696400832"
Apr 22 19:59:01.512480 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.512437 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hjrqd" podStartSLOduration=11.698643777000001 podStartE2EDuration="20.512423734s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.84462607 +0000 UTC m=+3.041946931" lastFinishedPulling="2026-04-22 19:58:52.658406014 +0000 UTC m=+11.855726888" observedRunningTime="2026-04-22 19:59:01.512079838 +0000 UTC m=+20.709400723" watchObservedRunningTime="2026-04-22 19:59:01.512423734 +0000 UTC m=+20.709744618"
Apr 22 19:59:01.547375 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.546556 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-52sgh" podStartSLOduration=3.606428854 podStartE2EDuration="20.546540725s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.84368293 +0000 UTC m=+3.041003805" lastFinishedPulling="2026-04-22 19:59:00.783794812 +0000 UTC m=+19.981115676" observedRunningTime="2026-04-22 19:59:01.546540105 +0000 UTC m=+20.743860988" watchObservedRunningTime="2026-04-22 19:59:01.546540725 +0000 UTC m=+20.743861607"
Apr 22 19:59:01.572768 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:01.572673 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8wmlw" podStartSLOduration=3.6339638 podStartE2EDuration="20.572657335s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.846626262 +0000 UTC m=+3.043947132" lastFinishedPulling="2026-04-22 19:59:00.785319803 +0000 UTC m=+19.982640667" observedRunningTime="2026-04-22 19:59:01.572494397 +0000 UTC m=+20.769815281" watchObservedRunningTime="2026-04-22 19:59:01.572657335 +0000 UTC m=+20.769978217"
Apr 22 19:59:02.220185 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.220140 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:59:02.276138 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.276027 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:59:02.220162514Z","UUID":"084b6f42-98d6-4d24-9341-4235d587018e","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:59:02.279013 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.278992 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:59:02.279125 ip-10-0-128-160 kubenswrapper[2575]: I0422
19:59:02.279021 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:59:02.330302 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.330232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:02.330420 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:02.330355 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:02.480328 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.480287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5sz4q" event={"ID":"c3fdb762-05cf-4828-8c3e-45c50aca2528","Type":"ContainerStarted","Data":"75deb081ca68f9562d13ab30925af931e4cdf07ffc419294456ba3c4a513969c"} Apr 22 19:59:02.483243 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.483220 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 19:59:02.483597 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.483574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"a5a7d110b5e2fc7c4995a28fe46e09b3d24b37773a31a8d18ba9b8e8682ef500"} Apr 22 19:59:02.483676 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.483605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" 
event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"25b523119376d9b7e84b1d056543b9ba73926bb7975cd3d6046dd3a54034a182"} Apr 22 19:59:02.485417 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.485330 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" event={"ID":"23005632-886a-45ba-ac15-5a0a1282248f","Type":"ContainerStarted","Data":"18b16a24e797e6ba646083eef497c0884151a1a339550543cd4ecbb930aaf4f7"} Apr 22 19:59:02.495541 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:02.495485 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5sz4q" podStartSLOduration=4.558396202 podStartE2EDuration="21.495473956s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.846892256 +0000 UTC m=+3.044213124" lastFinishedPulling="2026-04-22 19:59:00.783970017 +0000 UTC m=+19.981290878" observedRunningTime="2026-04-22 19:59:02.495273885 +0000 UTC m=+21.692594768" watchObservedRunningTime="2026-04-22 19:59:02.495473956 +0000 UTC m=+21.692794839" Apr 22 19:59:03.331222 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.331040 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:03.331448 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:03.331318 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:03.363195 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.363165 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:59:03.363837 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.363817 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:59:03.488742 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.488711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" event={"ID":"23005632-886a-45ba-ac15-5a0a1282248f","Type":"ContainerStarted","Data":"18ec5d63bc2446c06960021ef718906e7c803e943a2c580bf4a55400e1a4c2d0"} Apr 22 19:59:03.489371 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.488970 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:59:03.489440 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.489390 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hjrqd" Apr 22 19:59:03.505507 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:03.505464 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5x25w" podStartSLOduration=3.071216909 podStartE2EDuration="22.505452433s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.847211995 +0000 UTC m=+3.044532873" lastFinishedPulling="2026-04-22 19:59:03.281447518 +0000 UTC m=+22.478768397" observedRunningTime="2026-04-22 19:59:03.505348885 +0000 UTC m=+22.702669767" watchObservedRunningTime="2026-04-22 19:59:03.505452433 +0000 UTC m=+22.702773316" Apr 22 19:59:04.330099 ip-10-0-128-160 kubenswrapper[2575]: I0422 
19:59:04.330069 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:04.330272 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:04.330203 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:04.494407 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:04.494372 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 19:59:04.496390 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:04.496364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"6c59a278b23927bb9a82fa85c1044449260922585b20707fcd50ed9c546aaa2b"} Apr 22 19:59:05.331134 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:05.331101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:05.331304 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:05.331201 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:06.331091 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.330865 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:06.331642 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:06.331104 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:06.501349 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.501325 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 19:59:06.501713 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.501685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"7dfdf90f4e055977181ca960741163d300f40d6486e49194b7634198a071788b"} Apr 22 19:59:06.502016 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.501992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:59:06.502123 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.502024 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:59:06.502241 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.502221 2575 scope.go:117] "RemoveContainer" 
containerID="0b55553af85c9890508a5034ba749a9044ffe7764a799b6a5875b8b2df1ac13c" Apr 22 19:59:06.503484 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.503458 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd664ecc-6372-4249-9d3f-4ca9da5c429c" containerID="b0c458632bdb4b23e4febff95b57da833ac4e560d050418be5be80ae65e13127" exitCode=0 Apr 22 19:59:06.503568 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.503493 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerDied","Data":"b0c458632bdb4b23e4febff95b57da833ac4e560d050418be5be80ae65e13127"} Apr 22 19:59:06.517879 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:06.517853 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:59:07.331053 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.331020 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:07.331177 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:07.331140 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:07.508473 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.508446 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 19:59:07.508885 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.508851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" event={"ID":"8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed","Type":"ContainerStarted","Data":"289239ee16d904b842d04004da545bfb4e240da566ea022a172bd7b1bd09c324"} Apr 22 19:59:07.509312 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.509239 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:59:07.511159 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.511138 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd664ecc-6372-4249-9d3f-4ca9da5c429c" containerID="cfde60c07a629b611960ab216fba28a56e86a92e7e6f3a7b94afbe171b6ee36d" exitCode=0 Apr 22 19:59:07.511230 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.511178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerDied","Data":"cfde60c07a629b611960ab216fba28a56e86a92e7e6f3a7b94afbe171b6ee36d"} Apr 22 19:59:07.523475 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.523451 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" Apr 22 19:59:07.537000 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.536959 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m" podStartSLOduration=9.544267228 
podStartE2EDuration="26.536949063s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.839257359 +0000 UTC m=+3.036578221" lastFinishedPulling="2026-04-22 19:59:00.831939182 +0000 UTC m=+20.029260056" observedRunningTime="2026-04-22 19:59:07.535315573 +0000 UTC m=+26.732636456" watchObservedRunningTime="2026-04-22 19:59:07.536949063 +0000 UTC m=+26.734269946" Apr 22 19:59:07.823401 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.823346 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9tw8t"] Apr 22 19:59:07.823510 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.823423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:07.823510 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:07.823496 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:07.826082 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.826061 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rw25k"] Apr 22 19:59:07.826190 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:07.826157 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:07.826257 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:07.826239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:08.515235 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:08.515015 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd664ecc-6372-4249-9d3f-4ca9da5c429c" containerID="20812cd63f214d94c47d4abc508fd32da52794caf3ae6f95fb3d648166614d90" exitCode=0 Apr 22 19:59:08.515668 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:08.515063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerDied","Data":"20812cd63f214d94c47d4abc508fd32da52794caf3ae6f95fb3d648166614d90"} Apr 22 19:59:09.330716 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:09.330679 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:09.330908 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:09.330687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:09.330908 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:09.330805 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:09.330908 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:09.330892 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:11.331410 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:11.331381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:11.332048 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:11.331472 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:11.332048 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:11.331569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:11.332048 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:11.331692 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:13.330857 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.330825 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 19:59:13.331328 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.330825 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t" Apr 22 19:59:13.331328 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:13.330967 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rw25k" podUID="7c3e7957-917d-44e3-8833-76ccc4a5d167" Apr 22 19:59:13.331328 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:13.330996 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9tw8t" podUID="dd53ea43-190e-42c7-b4f7-20127893755e" Apr 22 19:59:13.674367 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.674343 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-160.ec2.internal" event="NodeReady" Apr 22 19:59:13.674510 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.674497 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:59:13.720057 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.720030 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dvs95"] Apr 22 19:59:13.725189 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.725161 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lz5pr"] Apr 22 19:59:13.725326 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.725309 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dvs95" Apr 22 19:59:13.728166 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.728109 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:59:13.728166 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.728134 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tzbrr\"" Apr 22 19:59:13.728318 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.728249 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lz5pr" Apr 22 19:59:13.728484 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.728366 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:59:13.730579 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.730516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dvs95"] Apr 22 19:59:13.730697 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.730678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lxbw7\"" Apr 22 19:59:13.730784 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.730684 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:59:13.731357 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.731127 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:59:13.731357 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.731211 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:59:13.737699 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.737674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lz5pr"] Apr 22 19:59:13.860649 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.860622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddc58e3d-3248-48a6-990e-23900cf6ae7f-config-volume\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95" Apr 22 19:59:13.860793 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.860668 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr" Apr 22 19:59:13.860793 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.860690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95" Apr 22 19:59:13.860793 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.860740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fbn\" (UniqueName: \"kubernetes.io/projected/ddc58e3d-3248-48a6-990e-23900cf6ae7f-kube-api-access-74fbn\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95" Apr 22 19:59:13.860793 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.860776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4pb7\" (UniqueName: \"kubernetes.io/projected/cfae59ae-9e91-4cd6-b825-4209ded69c88-kube-api-access-m4pb7\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr" Apr 22 19:59:13.860932 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.860814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddc58e3d-3248-48a6-990e-23900cf6ae7f-tmp-dir\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95" Apr 22 
19:59:13.962113 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr" Apr 22 19:59:13.962229 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95" Apr 22 19:59:13.962229 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:13.962214 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:13.962229 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:13.962222 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:13.962386 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:13.962258 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 19:59:14.462243522 +0000 UTC m=+33.659564383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:13.962386 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:13.962276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:14.462266636 +0000 UTC m=+33.659587499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:13.962386 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74fbn\" (UniqueName: \"kubernetes.io/projected/ddc58e3d-3248-48a6-990e-23900cf6ae7f-kube-api-access-74fbn\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:13.962386 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4pb7\" (UniqueName: \"kubernetes.io/projected/cfae59ae-9e91-4cd6-b825-4209ded69c88-kube-api-access-m4pb7\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:13.962386 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddc58e3d-3248-48a6-990e-23900cf6ae7f-tmp-dir\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:13.962630 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddc58e3d-3248-48a6-990e-23900cf6ae7f-config-volume\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:13.962675 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.962657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddc58e3d-3248-48a6-990e-23900cf6ae7f-tmp-dir\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:13.970498 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.970466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddc58e3d-3248-48a6-990e-23900cf6ae7f-config-volume\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:13.973211 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.973191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fbn\" (UniqueName: \"kubernetes.io/projected/ddc58e3d-3248-48a6-990e-23900cf6ae7f-kube-api-access-74fbn\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:13.973307 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:13.973292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4pb7\" (UniqueName: \"kubernetes.io/projected/cfae59ae-9e91-4cd6-b825-4209ded69c88-kube-api-access-m4pb7\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:14.465964 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:14.465745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:14.465964 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:14.465945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:14.466422 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.465890 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:14.466422 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.466045 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:14.466422 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.466057 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:15.466041253 +0000 UTC m=+34.663362114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:14.466422 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.466081 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 19:59:15.466070749 +0000 UTC m=+34.663391609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:14.529342 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:14.529317 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd664ecc-6372-4249-9d3f-4ca9da5c429c" containerID="8f0f90fc559a6242f4d120aa3a4601733e2030cea9a9cf0feb3136eff987640f" exitCode=0
Apr 22 19:59:14.529486 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:14.529357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerDied","Data":"8f0f90fc559a6242f4d120aa3a4601733e2030cea9a9cf0feb3136eff987640f"}
Apr 22 19:59:14.970129 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:14.970097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:59:14.970257 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:14.970135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:14.970301 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.970252 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:59:14.970301 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.970282 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:59:14.970301 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.970295 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:59:14.970408 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.970308 2575 projected.go:194] Error preparing data for projected volume kube-api-access-286m4 for pod openshift-network-diagnostics/network-check-target-9tw8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:59:14.970408 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.970319 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:46.970301382 +0000 UTC m=+66.167622261 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:59:14.970408 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:14.970340 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4 podName:dd53ea43-190e-42c7-b4f7-20127893755e nodeName:}" failed. No retries permitted until 2026-04-22 19:59:46.970330614 +0000 UTC m=+66.167651474 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-286m4" (UniqueName: "kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4") pod "network-check-target-9tw8t" (UID: "dd53ea43-190e-42c7-b4f7-20127893755e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:59:15.330959 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.330909 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:59:15.331155 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.331133 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:15.333769 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.333749 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:59:15.334030 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.334013 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:59:15.334118 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.334059 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cbww9\""
Apr 22 19:59:15.334175 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.334141 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:59:15.335106 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.335089 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-79t2x\""
Apr 22 19:59:15.473501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.473480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:15.473789 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.473505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:15.473789 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:15.473583 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:15.473789 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:15.473678 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:17.473656214 +0000 UTC m=+36.670977074 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:15.473789 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:15.473731 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:15.473789 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:15.473769 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 19:59:17.473759796 +0000 UTC m=+36.671080661 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:15.533757 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.533738 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd664ecc-6372-4249-9d3f-4ca9da5c429c" containerID="2dfb14e94a8522e790ffc3272c4faa6e3633309fcd0055d80bb53c0f1279d9b3" exitCode=0
Apr 22 19:59:15.533839 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:15.533768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerDied","Data":"2dfb14e94a8522e790ffc3272c4faa6e3633309fcd0055d80bb53c0f1279d9b3"}
Apr 22 19:59:16.537669 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:16.537639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" event={"ID":"bd664ecc-6372-4249-9d3f-4ca9da5c429c","Type":"ContainerStarted","Data":"318ae359b60e697017105df856227deb1c6d8bb51e30004950f38ef146542184"}
Apr 22 19:59:16.561229 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:16.561187 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vn8pw" podStartSLOduration=5.464434034 podStartE2EDuration="35.561175656s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:43.842831991 +0000 UTC m=+3.040152859" lastFinishedPulling="2026-04-22 19:59:13.93957362 +0000 UTC m=+33.136894481" observedRunningTime="2026-04-22 19:59:16.560551752 +0000 UTC m=+35.757872672" watchObservedRunningTime="2026-04-22 19:59:16.561175656 +0000 UTC m=+35.758496539"
Apr 22 19:59:17.486421 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:17.486387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:17.486561 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:17.486425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:17.486561 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:17.486527 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:17.486669 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:17.486571 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:17.486669 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:17.486597 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:21.486583184 +0000 UTC m=+40.683904045 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:17.486669 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:17.486632 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 19:59:21.48661576 +0000 UTC m=+40.683936639 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:21.512371 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:21.512342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:21.512371 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:21.512375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:21.512757 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:21.512480 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:21.512757 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:21.512491 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:21.512757 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:21.512526 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 19:59:29.512514187 +0000 UTC m=+48.709835048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:21.512757 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:21.512539 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:29.512533062 +0000 UTC m=+48.709853923 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:29.563900 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:29.563862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:29.563900 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:29.563903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:29.564447 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:29.564027 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:29.564447 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:29.564075 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:29.564447 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:29.564087 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:45.564070277 +0000 UTC m=+64.761391142 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:29.564447 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:29.564134 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 19:59:45.564117605 +0000 UTC m=+64.761438473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:39.528217 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:39.528184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jw84m"
Apr 22 19:59:45.566806 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:45.566776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 19:59:45.566806 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:45.566809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 19:59:45.567213 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:45.566907 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:45.567213 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:45.566909 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:45.567213 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:45.566982 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert podName:cfae59ae-9e91-4cd6-b825-4209ded69c88 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:17.566962537 +0000 UTC m=+96.764283417 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert") pod "ingress-canary-lz5pr" (UID: "cfae59ae-9e91-4cd6-b825-4209ded69c88") : secret "canary-serving-cert" not found
Apr 22 19:59:45.567213 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:45.566998 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls podName:ddc58e3d-3248-48a6-990e-23900cf6ae7f nodeName:}" failed. No retries permitted until 2026-04-22 20:00:17.566991336 +0000 UTC m=+96.764312200 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls") pod "dns-default-dvs95" (UID: "ddc58e3d-3248-48a6-990e-23900cf6ae7f") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:46.975487 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:46.975451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k"
Apr 22 19:59:46.975487 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:46.975488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:46.978755 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:46.978737 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:59:46.978801 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:46.978750 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:59:46.986244 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:46.986225 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:59:46.986296 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:46.986286 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs podName:7c3e7957-917d-44e3-8833-76ccc4a5d167 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:50.986267887 +0000 UTC m=+130.183588768 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs") pod "network-metrics-daemon-rw25k" (UID: "7c3e7957-917d-44e3-8833-76ccc4a5d167") : secret "metrics-daemon-secret" not found
Apr 22 19:59:46.988777 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:46.988764 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:59:47.000434 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:47.000408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-286m4\" (UniqueName: \"kubernetes.io/projected/dd53ea43-190e-42c7-b4f7-20127893755e-kube-api-access-286m4\") pod \"network-check-target-9tw8t\" (UID: \"dd53ea43-190e-42c7-b4f7-20127893755e\") " pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:47.150225 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:47.148043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cbww9\""
Apr 22 19:59:47.154978 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:47.154962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:47.274053 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:47.274024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9tw8t"]
Apr 22 19:59:47.277400 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:59:47.277373 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd53ea43_190e_42c7_b4f7_20127893755e.slice/crio-dfcad133f5889c7cd3f4bfb8621e6c9f51ab10e86d9a327a57cb3f9592391bd6 WatchSource:0}: Error finding container dfcad133f5889c7cd3f4bfb8621e6c9f51ab10e86d9a327a57cb3f9592391bd6: Status 404 returned error can't find the container with id dfcad133f5889c7cd3f4bfb8621e6c9f51ab10e86d9a327a57cb3f9592391bd6
Apr 22 19:59:47.594321 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:47.594229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9tw8t" event={"ID":"dd53ea43-190e-42c7-b4f7-20127893755e","Type":"ContainerStarted","Data":"dfcad133f5889c7cd3f4bfb8621e6c9f51ab10e86d9a327a57cb3f9592391bd6"}
Apr 22 19:59:50.602073 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:50.602037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9tw8t" event={"ID":"dd53ea43-190e-42c7-b4f7-20127893755e","Type":"ContainerStarted","Data":"43f1c9d50b559ff1d337d68c4bb4034f21b67c535cb50d4fb8dde4ec168b3298"}
Apr 22 19:59:50.602559 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:50.602167 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 19:59:50.618250 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:50.618207 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9tw8t" podStartSLOduration=66.99960038 podStartE2EDuration="1m9.618193666s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:59:47.279145122 +0000 UTC m=+66.476465989" lastFinishedPulling="2026-04-22 19:59:49.89773841 +0000 UTC m=+69.095059275" observedRunningTime="2026-04-22 19:59:50.617167196 +0000 UTC m=+69.814488080" watchObservedRunningTime="2026-04-22 19:59:50.618193666 +0000 UTC m=+69.815514548"
Apr 22 19:59:51.563314 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.563275 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"]
Apr 22 19:59:51.567486 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.567454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 19:59:51.568073 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.568046 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-c4cc68cf4-99gts"]
Apr 22 19:59:51.570123 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.570105 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 19:59:51.570219 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.570115 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 19:59:51.570452 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.570437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6cr7z\""
Apr 22 19:59:51.570562 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.570545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 19:59:51.570606 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.570559 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 19:59:51.573485 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.573468 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 19:59:51.573686 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.573663 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 19:59:51.573785 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.573759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 19:59:51.574124 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.573800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-m8xbs\""
Apr 22 19:59:51.574124 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.573866 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 19:59:51.574124 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.574033 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 19:59:51.574259 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.574155 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 19:59:51.574400 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.574379 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 19:59:51.577370 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.577351 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"]
Apr 22 19:59:51.581305 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.581284 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-c4cc68cf4-99gts"]
Apr 22 19:59:51.602243 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9nnt\" (UniqueName: \"kubernetes.io/projected/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-kube-api-access-z9nnt\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-default-certificate\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wvj4\" (UniqueName: \"kubernetes.io/projected/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-kube-api-access-9wvj4\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-stats-auth\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 19:59:51.602539 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.602492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.703251 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:51.703346 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9nnt\" (UniqueName: \"kubernetes.io/projected/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-kube-api-access-z9nnt\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.703346 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-default-certificate\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.703421 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.703466 
ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:51.703522 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wvj4\" (UniqueName: \"kubernetes.io/projected/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-kube-api-access-9wvj4\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:51.703522 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-stats-auth\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.703614 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.703568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.703614 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:51.703579 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:51.703614 
ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:51.703592 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:52.203569398 +0000 UTC m=+71.400890276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:51.703761 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:51.703638 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls podName:8dfd4ef7-39de-4af8-be99-d4e588c62cf8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:52.203622904 +0000 UTC m=+71.400943766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d2pwr" (UID: "8dfd4ef7-39de-4af8-be99-d4e588c62cf8") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:51.703761 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:51.703639 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:51.703761 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:51.703682 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:59:52.203672948 +0000 UTC m=+71.400993810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : secret "router-metrics-certs-default" not found Apr 22 19:59:51.704421 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.704404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:51.705676 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.705660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-default-certificate\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.705721 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.705699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-stats-auth\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.711761 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.711737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9nnt\" (UniqueName: \"kubernetes.io/projected/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-kube-api-access-z9nnt\") pod \"router-default-c4cc68cf4-99gts\" (UID: 
\"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:51.711887 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:51.711873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wvj4\" (UniqueName: \"kubernetes.io/projected/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-kube-api-access-9wvj4\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:52.207062 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:52.207026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:52.207211 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:52.207102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:52.207211 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:52.207131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:52.207211 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:52.207172 2575 secret.go:189] Couldn't get 
secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:52.207354 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:52.207227 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:52.207354 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:52.207240 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:53.207221952 +0000 UTC m=+72.404542828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : secret "router-metrics-certs-default" not found Apr 22 19:59:52.207354 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:52.207258 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:53.207250334 +0000 UTC m=+72.404571202 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:52.207354 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:52.207272 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls podName:8dfd4ef7-39de-4af8-be99-d4e588c62cf8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:53.207264142 +0000 UTC m=+72.404585005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d2pwr" (UID: "8dfd4ef7-39de-4af8-be99-d4e588c62cf8") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:53.214093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:53.214065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:53.214093 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:53.214097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:53.214501 ip-10-0-128-160 
kubenswrapper[2575]: E0422 19:59:53.214187 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:53.214501 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:53.214209 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:55.214192331 +0000 UTC m=+74.411513214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:53.214501 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:53.214238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:53.214501 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:53.214296 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls podName:8dfd4ef7-39de-4af8-be99-d4e588c62cf8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:55.214288689 +0000 UTC m=+74.411609550 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d2pwr" (UID: "8dfd4ef7-39de-4af8-be99-d4e588c62cf8") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:53.214501 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:53.214338 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:53.214501 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:53.214389 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:55.214372388 +0000 UTC m=+74.411693253 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : secret "router-metrics-certs-default" not found Apr 22 19:59:55.226573 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:55.226463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:55.226573 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:55.226523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: 
\"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:55.226573 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:55.226543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:55.227152 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:55.227030 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:55.227152 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:55.227062 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:59.227039087 +0000 UTC m=+78.424359963 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:55.227152 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:55.227078 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:55.227152 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:55.227106 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:59.227088576 +0000 UTC m=+78.424409456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : secret "router-metrics-certs-default" not found Apr 22 19:59:55.227152 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:55.227128 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls podName:8dfd4ef7-39de-4af8-be99-d4e588c62cf8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:59.227115973 +0000 UTC m=+78.424436851 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d2pwr" (UID: "8dfd4ef7-39de-4af8-be99-d4e588c62cf8") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:56.432583 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:56.432557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hrntb_7047d9ee-3de6-4d53-a8d2-f3e0ea19b774/dns-node-resolver/0.log" Apr 22 19:59:57.232272 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:57.232241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-52sgh_a09841fa-cd66-4311-9794-2efee23e5727/node-ca/0.log" Apr 22 19:59:59.252268 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.252224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:59.252268 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.252269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 19:59:59.252783 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:59.252351 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:59.252783 ip-10-0-128-160 
kubenswrapper[2575]: E0422 19:59:59.252406 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:07.252381401 +0000 UTC m=+86.449702275 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:59.252783 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.252467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 19:59:59.252783 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:59.252526 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls podName:8dfd4ef7-39de-4af8-be99-d4e588c62cf8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:07.252510732 +0000 UTC m=+86.449831593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d2pwr" (UID: "8dfd4ef7-39de-4af8-be99-d4e588c62cf8") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:59.252783 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:59.252590 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:59.252783 ip-10-0-128-160 kubenswrapper[2575]: E0422 19:59:59.252635 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:07.252622869 +0000 UTC m=+86.449943747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : secret "router-metrics-certs-default" not found Apr 22 19:59:59.546828 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.546761 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g"] Apr 22 19:59:59.549714 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.549699 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" Apr 22 19:59:59.552520 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.552498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:59:59.552520 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.552498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:59.553783 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.553769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-hjvqw\"" Apr 22 19:59:59.556663 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.556640 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g"] Apr 22 19:59:59.655003 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.654980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fxfl\" (UniqueName: \"kubernetes.io/projected/2d086995-4405-4443-9eb1-d1e194ba6492-kube-api-access-7fxfl\") pod \"volume-data-source-validator-7c6cbb6c87-tvm9g\" (UID: \"2d086995-4405-4443-9eb1-d1e194ba6492\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" Apr 22 19:59:59.755955 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.755934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fxfl\" (UniqueName: \"kubernetes.io/projected/2d086995-4405-4443-9eb1-d1e194ba6492-kube-api-access-7fxfl\") pod \"volume-data-source-validator-7c6cbb6c87-tvm9g\" (UID: \"2d086995-4405-4443-9eb1-d1e194ba6492\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" Apr 22 19:59:59.763407 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.763381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fxfl\" (UniqueName: \"kubernetes.io/projected/2d086995-4405-4443-9eb1-d1e194ba6492-kube-api-access-7fxfl\") pod \"volume-data-source-validator-7c6cbb6c87-tvm9g\" (UID: \"2d086995-4405-4443-9eb1-d1e194ba6492\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" Apr 22 19:59:59.858460 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.858408 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" Apr 22 19:59:59.987135 ip-10-0-128-160 kubenswrapper[2575]: I0422 19:59:59.987105 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g"] Apr 22 19:59:59.990046 ip-10-0-128-160 kubenswrapper[2575]: W0422 19:59:59.990018 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d086995_4405_4443_9eb1_d1e194ba6492.slice/crio-6844b2ca8a0db26e6168ed40aedfc8fcbecc9d263136ee65b0e59f087eb99f97 WatchSource:0}: Error finding container 6844b2ca8a0db26e6168ed40aedfc8fcbecc9d263136ee65b0e59f087eb99f97: Status 404 returned error can't find the container with id 6844b2ca8a0db26e6168ed40aedfc8fcbecc9d263136ee65b0e59f087eb99f97 Apr 22 20:00:00.452052 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.452021 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4v6ps"] Apr 22 20:00:00.455302 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.455280 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.458197 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.458171 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 20:00:00.458197 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.458190 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 20:00:00.458367 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.458262 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 20:00:00.459294 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.459271 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 20:00:00.459411 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.459329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vqmv4\"" Apr 22 20:00:00.464820 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.464711 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 20:00:00.465903 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.465881 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4v6ps"] Apr 22 20:00:00.561534 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.561511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d677224f-4cd1-4503-9169-7d8613c67e70-trusted-ca\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.561689 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.561563 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d677224f-4cd1-4503-9169-7d8613c67e70-serving-cert\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.561689 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.561625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d677224f-4cd1-4503-9169-7d8613c67e70-config\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.561798 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.561686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k485g\" (UniqueName: \"kubernetes.io/projected/d677224f-4cd1-4503-9169-7d8613c67e70-kube-api-access-k485g\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.623085 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.623050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" event={"ID":"2d086995-4405-4443-9eb1-d1e194ba6492","Type":"ContainerStarted","Data":"6844b2ca8a0db26e6168ed40aedfc8fcbecc9d263136ee65b0e59f087eb99f97"} Apr 22 20:00:00.662853 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.662822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d677224f-4cd1-4503-9169-7d8613c67e70-trusted-ca\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.663003 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.662888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d677224f-4cd1-4503-9169-7d8613c67e70-serving-cert\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.663003 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.662943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d677224f-4cd1-4503-9169-7d8613c67e70-config\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.663003 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.662994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k485g\" (UniqueName: \"kubernetes.io/projected/d677224f-4cd1-4503-9169-7d8613c67e70-kube-api-access-k485g\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.663702 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.663682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d677224f-4cd1-4503-9169-7d8613c67e70-config\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.663788 ip-10-0-128-160 kubenswrapper[2575]: I0422 
20:00:00.663723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d677224f-4cd1-4503-9169-7d8613c67e70-trusted-ca\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.665503 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.665482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d677224f-4cd1-4503-9169-7d8613c67e70-serving-cert\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.672296 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.672275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k485g\" (UniqueName: \"kubernetes.io/projected/d677224f-4cd1-4503-9169-7d8613c67e70-kube-api-access-k485g\") pod \"console-operator-9d4b6777b-4v6ps\" (UID: \"d677224f-4cd1-4503-9169-7d8613c67e70\") " pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.766612 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.766550 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:00.887532 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:00.887506 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4v6ps"] Apr 22 20:00:01.107516 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:01.107444 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd677224f_4cd1_4503_9169_7d8613c67e70.slice/crio-15f1d1a946441034672e7b6520229775ba7847dfbe7e13b75db682409448fa04 WatchSource:0}: Error finding container 15f1d1a946441034672e7b6520229775ba7847dfbe7e13b75db682409448fa04: Status 404 returned error can't find the container with id 15f1d1a946441034672e7b6520229775ba7847dfbe7e13b75db682409448fa04 Apr 22 20:00:01.625557 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:01.625520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" event={"ID":"d677224f-4cd1-4503-9169-7d8613c67e70","Type":"ContainerStarted","Data":"15f1d1a946441034672e7b6520229775ba7847dfbe7e13b75db682409448fa04"} Apr 22 20:00:01.626767 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:01.626745 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" event={"ID":"2d086995-4405-4443-9eb1-d1e194ba6492","Type":"ContainerStarted","Data":"60383d7dae1ca214f4ce78e3632bb883c18c9e739af9266713e1183f26e3d899"} Apr 22 20:00:01.642427 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:01.642362 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tvm9g" podStartSLOduration=1.482573796 podStartE2EDuration="2.642348908s" podCreationTimestamp="2026-04-22 19:59:59 +0000 UTC" firstStartedPulling="2026-04-22 19:59:59.991774985 
+0000 UTC m=+79.189095849" lastFinishedPulling="2026-04-22 20:00:01.151550085 +0000 UTC m=+80.348870961" observedRunningTime="2026-04-22 20:00:01.641386367 +0000 UTC m=+80.838707252" watchObservedRunningTime="2026-04-22 20:00:01.642348908 +0000 UTC m=+80.839669791" Apr 22 20:00:03.632594 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:03.632567 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/0.log" Apr 22 20:00:03.632968 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:03.632607 2575 generic.go:358] "Generic (PLEG): container finished" podID="d677224f-4cd1-4503-9169-7d8613c67e70" containerID="1187c0d58bbcd34129a103d74b38800aaf44c13b0bf66b872d86a8ee8cf264d5" exitCode=255 Apr 22 20:00:03.632968 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:03.632649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" event={"ID":"d677224f-4cd1-4503-9169-7d8613c67e70","Type":"ContainerDied","Data":"1187c0d58bbcd34129a103d74b38800aaf44c13b0bf66b872d86a8ee8cf264d5"} Apr 22 20:00:03.632968 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:03.632871 2575 scope.go:117] "RemoveContainer" containerID="1187c0d58bbcd34129a103d74b38800aaf44c13b0bf66b872d86a8ee8cf264d5" Apr 22 20:00:04.635836 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:04.635806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/1.log" Apr 22 20:00:04.636228 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:04.636220 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/0.log" Apr 22 20:00:04.636282 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:04.636256 2575 generic.go:358] 
"Generic (PLEG): container finished" podID="d677224f-4cd1-4503-9169-7d8613c67e70" containerID="f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723" exitCode=255 Apr 22 20:00:04.636347 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:04.636330 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" event={"ID":"d677224f-4cd1-4503-9169-7d8613c67e70","Type":"ContainerDied","Data":"f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723"} Apr 22 20:00:04.636398 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:04.636386 2575 scope.go:117] "RemoveContainer" containerID="1187c0d58bbcd34129a103d74b38800aaf44c13b0bf66b872d86a8ee8cf264d5" Apr 22 20:00:04.636588 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:04.636571 2575 scope.go:117] "RemoveContainer" containerID="f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723" Apr 22 20:00:04.636758 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:04.636742 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4v6ps_openshift-console-operator(d677224f-4cd1-4503-9169-7d8613c67e70)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podUID="d677224f-4cd1-4503-9169-7d8613c67e70" Apr 22 20:00:05.639815 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:05.639788 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/1.log" Apr 22 20:00:05.640207 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:05.640101 2575 scope.go:117] "RemoveContainer" containerID="f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723" Apr 22 20:00:05.640273 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:05.640257 2575 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4v6ps_openshift-console-operator(d677224f-4cd1-4503-9169-7d8613c67e70)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podUID="d677224f-4cd1-4503-9169-7d8613c67e70" Apr 22 20:00:07.314578 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:07.314540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:07.314602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts" Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:07.314622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:07.314699 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: E0422 
20:00:07.314712 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:07.314741 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:23.314723088 +0000 UTC m=+102.512043949 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : configmap references non-existent config key: service-ca.crt Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:07.314761 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls podName:8dfd4ef7-39de-4af8-be99-d4e588c62cf8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:23.314754508 +0000 UTC m=+102.512075369 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-d2pwr" (UID: "8dfd4ef7-39de-4af8-be99-d4e588c62cf8") : secret "cluster-monitoring-operator-tls" not found Apr 22 20:00:07.315014 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:07.314773 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs podName:eb61edaa-0f4d-4c21-a6c8-60949c33b5d1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:23.314766575 +0000 UTC m=+102.512087436 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs") pod "router-default-c4cc68cf4-99gts" (UID: "eb61edaa-0f4d-4c21-a6c8-60949c33b5d1") : secret "router-metrics-certs-default" not found Apr 22 20:00:08.083661 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.083626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dj9hh"] Apr 22 20:00:08.087858 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.087842 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.090681 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.090652 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 20:00:08.090789 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.090746 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 20:00:08.090870 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.090853 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5prhs\"" Apr 22 20:00:08.091998 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.091976 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 20:00:08.092101 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.092015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 20:00:08.095151 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.095132 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dj9hh"] Apr 22 20:00:08.121059 
ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.121034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.121140 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.121088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfgm2\" (UniqueName: \"kubernetes.io/projected/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-kube-api-access-vfgm2\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.121140 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.121128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-crio-socket\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.121206 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.121188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.121238 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.121210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-data-volume\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.221773 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.221741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.221948 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.221787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-data-volume\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.221948 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.221848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.221948 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:08.221895 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:08.221948 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.221912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfgm2\" (UniqueName: 
\"kubernetes.io/projected/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-kube-api-access-vfgm2\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.222135 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.221964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-crio-socket\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.222135 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:08.221987 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls podName:b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf nodeName:}" failed. No retries permitted until 2026-04-22 20:00:08.721966086 +0000 UTC m=+87.919286951 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dj9hh" (UID: "b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:08.222135 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.222027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-crio-socket\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.222245 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.222187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-data-volume\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.222373 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.222356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.231182 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.231162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfgm2\" (UniqueName: \"kubernetes.io/projected/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-kube-api-access-vfgm2\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 
22 20:00:08.724264 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:08.724236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:08.724603 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:08.724338 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:08.724603 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:08.724390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls podName:b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf nodeName:}" failed. No retries permitted until 2026-04-22 20:00:09.724376567 +0000 UTC m=+88.921697428 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dj9hh" (UID: "b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:09.731685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:09.731655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh" Apr 22 20:00:09.732052 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:09.731748 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:09.732052 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:09.731805 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls podName:b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf nodeName:}" failed. No retries permitted until 2026-04-22 20:00:11.731791321 +0000 UTC m=+90.929112181 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dj9hh" (UID: "b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf") : secret "insights-runtime-extractor-tls" not found
Apr 22 20:00:10.767068 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:10.767026 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps"
Apr 22 20:00:10.767068 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:10.767073 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps"
Apr 22 20:00:10.767460 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:10.767383 2575 scope.go:117] "RemoveContainer" containerID="f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723"
Apr 22 20:00:10.767545 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:10.767527 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4v6ps_openshift-console-operator(d677224f-4cd1-4503-9169-7d8613c67e70)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podUID="d677224f-4cd1-4503-9169-7d8613c67e70"
Apr 22 20:00:11.747544 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:11.747513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh"
Apr 22 20:00:11.747680 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:11.747625 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 20:00:11.747680 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:11.747679 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls podName:b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf nodeName:}" failed. No retries permitted until 2026-04-22 20:00:15.747661348 +0000 UTC m=+94.944982228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dj9hh" (UID: "b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf") : secret "insights-runtime-extractor-tls" not found
Apr 22 20:00:15.773302 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:15.773269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh"
Apr 22 20:00:15.773667 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:15.773414 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 20:00:15.773667 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:15.773474 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls podName:b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf nodeName:}" failed. No retries permitted until 2026-04-22 20:00:23.773455351 +0000 UTC m=+102.970776219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dj9hh" (UID: "b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf") : secret "insights-runtime-extractor-tls" not found
Apr 22 20:00:17.585569 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.585533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 20:00:17.585569 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.585571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 20:00:17.587678 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.587657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfae59ae-9e91-4cd6-b825-4209ded69c88-cert\") pod \"ingress-canary-lz5pr\" (UID: \"cfae59ae-9e91-4cd6-b825-4209ded69c88\") " pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 20:00:17.588153 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.588125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc58e3d-3248-48a6-990e-23900cf6ae7f-metrics-tls\") pod \"dns-default-dvs95\" (UID: \"ddc58e3d-3248-48a6-990e-23900cf6ae7f\") " pod="openshift-dns/dns-default-dvs95"
Apr 22 20:00:17.641436 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.641390 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tzbrr\""
Apr 22 20:00:17.647911 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.647895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lxbw7\""
Apr 22 20:00:17.648955 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.648939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dvs95"
Apr 22 20:00:17.656051 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.656031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lz5pr"
Apr 22 20:00:17.770326 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.770300 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dvs95"]
Apr 22 20:00:17.773100 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:17.773074 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddc58e3d_3248_48a6_990e_23900cf6ae7f.slice/crio-e6d4b80b735cb293189524dd8a9536853ed179d8701efc6a38df188f847af795 WatchSource:0}: Error finding container e6d4b80b735cb293189524dd8a9536853ed179d8701efc6a38df188f847af795: Status 404 returned error can't find the container with id e6d4b80b735cb293189524dd8a9536853ed179d8701efc6a38df188f847af795
Apr 22 20:00:17.785397 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:17.785370 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lz5pr"]
Apr 22 20:00:17.789288 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:17.789252 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfae59ae_9e91_4cd6_b825_4209ded69c88.slice/crio-51c2d69f68c41069011e884c587758188ccca922ea8f94eed650e056048a569c WatchSource:0}: Error finding container 51c2d69f68c41069011e884c587758188ccca922ea8f94eed650e056048a569c: Status 404 returned error can't find the container with id 51c2d69f68c41069011e884c587758188ccca922ea8f94eed650e056048a569c
Apr 22 20:00:18.664739 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:18.664699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dvs95" event={"ID":"ddc58e3d-3248-48a6-990e-23900cf6ae7f","Type":"ContainerStarted","Data":"e6d4b80b735cb293189524dd8a9536853ed179d8701efc6a38df188f847af795"}
Apr 22 20:00:18.666063 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:18.666022 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lz5pr" event={"ID":"cfae59ae-9e91-4cd6-b825-4209ded69c88","Type":"ContainerStarted","Data":"51c2d69f68c41069011e884c587758188ccca922ea8f94eed650e056048a569c"}
Apr 22 20:00:20.671724 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:20.671681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dvs95" event={"ID":"ddc58e3d-3248-48a6-990e-23900cf6ae7f","Type":"ContainerStarted","Data":"b031916f74df8ac08a60b4592551125462926374c440c09df431284a75f595fb"}
Apr 22 20:00:20.671724 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:20.671721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dvs95" event={"ID":"ddc58e3d-3248-48a6-990e-23900cf6ae7f","Type":"ContainerStarted","Data":"c458825419ae0267e71d94fad721bd09941d563d59ce42059866ad04d61184c9"}
Apr 22 20:00:20.672195 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:20.671758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dvs95"
Apr 22 20:00:20.672944 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:20.672909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lz5pr" event={"ID":"cfae59ae-9e91-4cd6-b825-4209ded69c88","Type":"ContainerStarted","Data":"83086810291d4f59aa763a671a8475048c5814684ed8a5084fa680174ef1f783"}
Apr 22 20:00:20.689557 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:20.689513 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dvs95" podStartSLOduration=65.797481037 podStartE2EDuration="1m7.689499379s" podCreationTimestamp="2026-04-22 19:59:13 +0000 UTC" firstStartedPulling="2026-04-22 20:00:17.774819543 +0000 UTC m=+96.972140409" lastFinishedPulling="2026-04-22 20:00:19.666837891 +0000 UTC m=+98.864158751" observedRunningTime="2026-04-22 20:00:20.687442538 +0000 UTC m=+99.884763421" watchObservedRunningTime="2026-04-22 20:00:20.689499379 +0000 UTC m=+99.886820262"
Apr 22 20:00:20.700938 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:20.700890 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lz5pr" podStartSLOduration=65.821636193 podStartE2EDuration="1m7.700880128s" podCreationTimestamp="2026-04-22 19:59:13 +0000 UTC" firstStartedPulling="2026-04-22 20:00:17.791009874 +0000 UTC m=+96.988330738" lastFinishedPulling="2026-04-22 20:00:19.670253805 +0000 UTC m=+98.867574673" observedRunningTime="2026-04-22 20:00:20.700810493 +0000 UTC m=+99.898131388" watchObservedRunningTime="2026-04-22 20:00:20.700880128 +0000 UTC m=+99.898201012"
Apr 22 20:00:21.606034 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:21.606004 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9tw8t"
Apr 22 20:00:23.329872 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.329842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:23.330244 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.329909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:23.330244 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.329940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 20:00:23.330652 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.330626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-service-ca-bundle\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:23.332365 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.332344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb61edaa-0f4d-4c21-a6c8-60949c33b5d1-metrics-certs\") pod \"router-default-c4cc68cf4-99gts\" (UID: \"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1\") " pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:23.332442 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.332380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfd4ef7-39de-4af8-be99-d4e588c62cf8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-d2pwr\" (UID: \"8dfd4ef7-39de-4af8-be99-d4e588c62cf8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 20:00:23.377253 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.377232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"
Apr 22 20:00:23.383868 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.383798 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:23.498956 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.498931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr"]
Apr 22 20:00:23.501243 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:23.501215 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dfd4ef7_39de_4af8_be99_d4e588c62cf8.slice/crio-4cacce744e3e36ff2b960ff6c6f5b2d81106b5d0f5b9e922e2c48d337ea782ea WatchSource:0}: Error finding container 4cacce744e3e36ff2b960ff6c6f5b2d81106b5d0f5b9e922e2c48d337ea782ea: Status 404 returned error can't find the container with id 4cacce744e3e36ff2b960ff6c6f5b2d81106b5d0f5b9e922e2c48d337ea782ea
Apr 22 20:00:23.516799 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.516779 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-c4cc68cf4-99gts"]
Apr 22 20:00:23.519243 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:23.519219 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb61edaa_0f4d_4c21_a6c8_60949c33b5d1.slice/crio-0ac19a27647223eac93af479215d7fdf7eb6228cc6e930e30adeb736885088a2 WatchSource:0}: Error finding container 0ac19a27647223eac93af479215d7fdf7eb6228cc6e930e30adeb736885088a2: Status 404 returned error can't find the container with id 0ac19a27647223eac93af479215d7fdf7eb6228cc6e930e30adeb736885088a2
Apr 22 20:00:23.681817 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.681783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-c4cc68cf4-99gts" event={"ID":"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1","Type":"ContainerStarted","Data":"8d904ac389e1cdc80de1c24e6f03cfa3cd7da5a57a08c23bac80348a98ffd4e3"}
Apr 22 20:00:23.681995 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.681825 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-c4cc68cf4-99gts" event={"ID":"eb61edaa-0f4d-4c21-a6c8-60949c33b5d1","Type":"ContainerStarted","Data":"0ac19a27647223eac93af479215d7fdf7eb6228cc6e930e30adeb736885088a2"}
Apr 22 20:00:23.682789 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.682765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" event={"ID":"8dfd4ef7-39de-4af8-be99-d4e588c62cf8","Type":"ContainerStarted","Data":"4cacce744e3e36ff2b960ff6c6f5b2d81106b5d0f5b9e922e2c48d337ea782ea"}
Apr 22 20:00:23.698522 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.698486 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-c4cc68cf4-99gts" podStartSLOduration=32.698474034 podStartE2EDuration="32.698474034s" podCreationTimestamp="2026-04-22 19:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:23.698352756 +0000 UTC m=+102.895673639" watchObservedRunningTime="2026-04-22 20:00:23.698474034 +0000 UTC m=+102.895794914"
Apr 22 20:00:23.833792 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.833733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh"
Apr 22 20:00:23.835775 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.835756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dj9hh\" (UID: \"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf\") " pod="openshift-insights/insights-runtime-extractor-dj9hh"
Apr 22 20:00:23.998038 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:23.997997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dj9hh"
Apr 22 20:00:24.128907 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.128839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dj9hh"]
Apr 22 20:00:24.133079 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:24.133054 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2af54cc_d1fd_48ec_b7ae_3c41f6c630cf.slice/crio-c84e41eaadfae2690bb895cd6916816a4571af69a639974a5f8d5dd8da2f3759 WatchSource:0}: Error finding container c84e41eaadfae2690bb895cd6916816a4571af69a639974a5f8d5dd8da2f3759: Status 404 returned error can't find the container with id c84e41eaadfae2690bb895cd6916816a4571af69a639974a5f8d5dd8da2f3759
Apr 22 20:00:24.384211 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.384134 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:24.387114 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.387089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:24.686999 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.686966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dj9hh" event={"ID":"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf","Type":"ContainerStarted","Data":"3d50f341209f9a9264049291adafeb1b06742ed240faba36bca698661aa8c7dc"}
Apr 22 20:00:24.687187 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.687007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dj9hh" event={"ID":"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf","Type":"ContainerStarted","Data":"c84e41eaadfae2690bb895cd6916816a4571af69a639974a5f8d5dd8da2f3759"}
Apr 22 20:00:24.687268 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.687255 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:24.688642 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:24.688619 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-c4cc68cf4-99gts"
Apr 22 20:00:25.330674 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.330651 2575 scope.go:117] "RemoveContainer" containerID="f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723"
Apr 22 20:00:25.691775 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.691739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dj9hh" event={"ID":"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf","Type":"ContainerStarted","Data":"f904582d3af55e8c69be6baa360b1aef6ac180d875b7a4a45618b55c35831c80"}
Apr 22 20:00:25.693463 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.693442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:00:25.693894 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.693876 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/1.log"
Apr 22 20:00:25.694033 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.693929 2575 generic.go:358] "Generic (PLEG): container finished" podID="d677224f-4cd1-4503-9169-7d8613c67e70" containerID="1d578c39360e2e7db9b028675452b033fe4aad1f01280c8492bf850caa567866" exitCode=255
Apr 22 20:00:25.694033 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.693954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" event={"ID":"d677224f-4cd1-4503-9169-7d8613c67e70","Type":"ContainerDied","Data":"1d578c39360e2e7db9b028675452b033fe4aad1f01280c8492bf850caa567866"}
Apr 22 20:00:25.694033 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.693995 2575 scope.go:117] "RemoveContainer" containerID="f6321043a2b8759226348bdc2303b28010e408e9c044df22eddab672456c9723"
Apr 22 20:00:25.694341 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.694323 2575 scope.go:117] "RemoveContainer" containerID="1d578c39360e2e7db9b028675452b033fe4aad1f01280c8492bf850caa567866"
Apr 22 20:00:25.694561 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:25.694535 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-4v6ps_openshift-console-operator(d677224f-4cd1-4503-9169-7d8613c67e70)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podUID="d677224f-4cd1-4503-9169-7d8613c67e70"
Apr 22 20:00:25.695519 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.695481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" event={"ID":"8dfd4ef7-39de-4af8-be99-d4e588c62cf8","Type":"ContainerStarted","Data":"e1b8d99c5dbb520d8e160b99986358a8d9510ece8713ef8d9f71a62a10539402"}
Apr 22 20:00:25.722782 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:25.722739 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-d2pwr" podStartSLOduration=33.312416255 podStartE2EDuration="34.722728207s" podCreationTimestamp="2026-04-22 19:59:51 +0000 UTC" firstStartedPulling="2026-04-22 20:00:23.503092382 +0000 UTC m=+102.700413245" lastFinishedPulling="2026-04-22 20:00:24.913404336 +0000 UTC m=+104.110725197" observedRunningTime="2026-04-22 20:00:25.72180066 +0000 UTC m=+104.919121546" watchObservedRunningTime="2026-04-22 20:00:25.722728207 +0000 UTC m=+104.920049089"
Apr 22 20:00:26.700476 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:26.700425 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:00:26.702122 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:26.702097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dj9hh" event={"ID":"b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf","Type":"ContainerStarted","Data":"8f0f299e25f682cce4d3960d201f1d8f826a5460cec5250988901a09fd45b959"}
Apr 22 20:00:26.727005 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:26.726961 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dj9hh" podStartSLOduration=16.367012026 podStartE2EDuration="18.726948415s" podCreationTimestamp="2026-04-22 20:00:08 +0000 UTC" firstStartedPulling="2026-04-22 20:00:24.192520535 +0000 UTC m=+103.389841402" lastFinishedPulling="2026-04-22 20:00:26.552456925 +0000 UTC m=+105.749777791" observedRunningTime="2026-04-22 20:00:26.725011171 +0000 UTC m=+105.922332053" watchObservedRunningTime="2026-04-22 20:00:26.726948415 +0000 UTC m=+105.924269298"
Apr 22 20:00:29.432580 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.432549 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"]
Apr 22 20:00:29.434627 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.434612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.438698 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.438674 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 20:00:29.438698 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.438687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 20:00:29.438698 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.438674 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 20:00:29.438913 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.438674 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zjqfn\""
Apr 22 20:00:29.443435 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.443415 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"]
Apr 22 20:00:29.475826 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.475808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ddc495d-0284-42b1-b944-aa586284e1fa-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.475943 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.475836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ddc495d-0284-42b1-b944-aa586284e1fa-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.475943 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.475856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ddc495d-0284-42b1-b944-aa586284e1fa-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.476019 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.475978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9585\" (UniqueName: \"kubernetes.io/projected/5ddc495d-0284-42b1-b944-aa586284e1fa-kube-api-access-w9585\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.577109 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.577086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9585\" (UniqueName: \"kubernetes.io/projected/5ddc495d-0284-42b1-b944-aa586284e1fa-kube-api-access-w9585\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.577221 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.577131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ddc495d-0284-42b1-b944-aa586284e1fa-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.577221 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.577150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ddc495d-0284-42b1-b944-aa586284e1fa-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.577221 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.577169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ddc495d-0284-42b1-b944-aa586284e1fa-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.577712 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.577690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ddc495d-0284-42b1-b944-aa586284e1fa-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.579518 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.579500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ddc495d-0284-42b1-b944-aa586284e1fa-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.579987 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.579967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ddc495d-0284-42b1-b944-aa586284e1fa-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.586315 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.586297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9585\" (UniqueName: \"kubernetes.io/projected/5ddc495d-0284-42b1-b944-aa586284e1fa-kube-api-access-w9585\") pod \"prometheus-operator-5676c8c784-dhg8l\" (UID: \"5ddc495d-0284-42b1-b944-aa586284e1fa\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.743280 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.743232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"
Apr 22 20:00:29.852145 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:29.852118 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dhg8l"]
Apr 22 20:00:29.854903 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:29.854873 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ddc495d_0284_42b1_b944_aa586284e1fa.slice/crio-f1806c42185140c67b3b226f5d7aa0d545486c025d41c18f0dc098ee5a6a7c0c WatchSource:0}: Error finding container f1806c42185140c67b3b226f5d7aa0d545486c025d41c18f0dc098ee5a6a7c0c: Status 404 returned error can't find the container with id f1806c42185140c67b3b226f5d7aa0d545486c025d41c18f0dc098ee5a6a7c0c
Apr 22 20:00:30.678659 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:30.678628 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dvs95"
Apr 22 20:00:30.714196 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:30.714166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l" event={"ID":"5ddc495d-0284-42b1-b944-aa586284e1fa","Type":"ContainerStarted","Data":"f1806c42185140c67b3b226f5d7aa0d545486c025d41c18f0dc098ee5a6a7c0c"}
Apr 22 20:00:30.767097 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:30.767069 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps"
Apr 22 20:00:30.767097 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:30.767103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps"
Apr 22 20:00:30.767468 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:30.767452 2575 scope.go:117] "RemoveContainer" containerID="1d578c39360e2e7db9b028675452b033fe4aad1f01280c8492bf850caa567866"
Apr 22 20:00:30.767655 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:30.767637 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-4v6ps_openshift-console-operator(d677224f-4cd1-4503-9169-7d8613c67e70)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podUID="d677224f-4cd1-4503-9169-7d8613c67e70"
Apr 22 20:00:31.721771 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:31.721732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l" event={"ID":"5ddc495d-0284-42b1-b944-aa586284e1fa","Type":"ContainerStarted","Data":"230897b921b84913baffb4755f6ccf067bad0b9fdd6deaab63d2fff133ff5c2f"}
Apr 22 20:00:31.721771 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:31.721773 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l" event={"ID":"5ddc495d-0284-42b1-b944-aa586284e1fa","Type":"ContainerStarted","Data":"7abdfa85873118520e340385dafacaa5850dd8c454913c7951dee95a6901267f"}
Apr 22 20:00:31.738329 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:31.738277 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhg8l" podStartSLOduration=1.620568543 podStartE2EDuration="2.738263672s" podCreationTimestamp="2026-04-22 20:00:29 +0000 UTC" firstStartedPulling="2026-04-22 20:00:29.856798278 +0000 UTC m=+109.054119143" lastFinishedPulling="2026-04-22 20:00:30.974493395 +0000 UTC m=+110.171814272" observedRunningTime="2026-04-22 20:00:31.737678466 +0000 UTC m=+110.934999350" watchObservedRunningTime="2026-04-22 20:00:31.738263672 +0000 UTC m=+110.935584572"
Apr 22 20:00:33.763498 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.763469 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5"]
Apr 22 20:00:33.765517 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.765499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5"
Apr 22 20:00:33.768082 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.768058 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 20:00:33.768264 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.768244 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-l5d8m\""
Apr 22 20:00:33.768374 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.768357 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 20:00:33.775974 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.775952 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5"]
Apr 22 20:00:33.794555 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.794527 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f77r8"]
Apr 22 20:00:33.796704 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.796685 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.799418 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.799386 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vfgt6\"" Apr 22 20:00:33.799577 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.799388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 20:00:33.799577 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.799464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 20:00:33.799727 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.799711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 20:00:33.806928 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.806891 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9187cac0-ce5c-42b4-93e0-7efd963a5632-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.807033 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.807009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.807095 ip-10-0-128-160 
kubenswrapper[2575]: I0422 20:00:33.807045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrcb\" (UniqueName: \"kubernetes.io/projected/9187cac0-ce5c-42b4-93e0-7efd963a5632-kube-api-access-xwrcb\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.807095 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.807077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.908316 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-wtmp\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908316 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76pn\" (UniqueName: \"kubernetes.io/projected/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-kube-api-access-w76pn\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908539 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-accelerators-collector-config\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908539 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-metrics-client-ca\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908539 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908539 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-root\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-textfile\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " 
pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.908685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrcb\" (UniqueName: \"kubernetes.io/projected/9187cac0-ce5c-42b4-93e0-7efd963a5632-kube-api-access-xwrcb\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.908685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.908685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-sys\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908902 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908702 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9187cac0-ce5c-42b4-93e0-7efd963a5632-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.908902 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.908717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-tls\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:33.908902 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:33.908791 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 20:00:33.908902 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:33.908865 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-tls podName:9187cac0-ce5c-42b4-93e0-7efd963a5632 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:34.408845045 +0000 UTC m=+113.606165920 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vsgb5" (UID: "9187cac0-ce5c-42b4-93e0-7efd963a5632") : secret "openshift-state-metrics-tls" not found Apr 22 20:00:33.909357 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.909339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9187cac0-ce5c-42b4-93e0-7efd963a5632-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.911374 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.911358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:33.919005 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:33.918910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrcb\" (UniqueName: \"kubernetes.io/projected/9187cac0-ce5c-42b4-93e0-7efd963a5632-kube-api-access-xwrcb\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:34.010049 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-root\") pod \"node-exporter-f77r8\" (UID: 
\"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010189 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-textfile\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010189 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-sys\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010189 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-tls\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010189 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-wtmp\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010189 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-sys\") pod \"node-exporter-f77r8\" (UID: 
\"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010189 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w76pn\" (UniqueName: \"kubernetes.io/projected/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-kube-api-access-w76pn\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-root\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-accelerators-collector-config\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:34.010235 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-metrics-client-ca\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010475 ip-10-0-128-160 
kubenswrapper[2575]: I0422 20:00:34.010269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:34.010299 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-tls podName:f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:34.510278105 +0000 UTC m=+113.707598985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-tls") pod "node-exporter-f77r8" (UID: "f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7") : secret "node-exporter-tls" not found Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-wtmp\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010475 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-textfile\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.010971 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010953 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-metrics-client-ca\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.011031 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.010986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-accelerators-collector-config\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.012341 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.012322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.024257 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.024205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76pn\" (UniqueName: \"kubernetes.io/projected/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-kube-api-access-w76pn\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.412864 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.412831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:34.415239 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.415208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9187cac0-ce5c-42b4-93e0-7efd963a5632-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vsgb5\" (UID: \"9187cac0-ce5c-42b4-93e0-7efd963a5632\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:34.513339 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.513314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-tls\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.515466 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.515448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7-node-exporter-tls\") pod \"node-exporter-f77r8\" (UID: \"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7\") " pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.674082 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.674007 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" Apr 22 20:00:34.705489 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.705462 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f77r8" Apr 22 20:00:34.713456 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:34.713426 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4eda699_e8d5_47b6_b1a3_62d6fc0c3fc7.slice/crio-8c9d7e5ff4fc5769741cdeaa02938cbfbd294f9d9d6a5d05bc4cfebfc9075a0a WatchSource:0}: Error finding container 8c9d7e5ff4fc5769741cdeaa02938cbfbd294f9d9d6a5d05bc4cfebfc9075a0a: Status 404 returned error can't find the container with id 8c9d7e5ff4fc5769741cdeaa02938cbfbd294f9d9d6a5d05bc4cfebfc9075a0a Apr 22 20:00:34.729254 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.729202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f77r8" event={"ID":"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7","Type":"ContainerStarted","Data":"8c9d7e5ff4fc5769741cdeaa02938cbfbd294f9d9d6a5d05bc4cfebfc9075a0a"} Apr 22 20:00:34.792440 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:34.792383 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5"] Apr 22 20:00:34.797431 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:34.797381 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9187cac0_ce5c_42b4_93e0_7efd963a5632.slice/crio-27a219fdfeb78e9933a56e220ce341821e0b590dc37acf6ca99967c7422bbe82 WatchSource:0}: Error finding container 27a219fdfeb78e9933a56e220ce341821e0b590dc37acf6ca99967c7422bbe82: Status 404 returned error can't find the container with id 27a219fdfeb78e9933a56e220ce341821e0b590dc37acf6ca99967c7422bbe82 Apr 22 20:00:35.733834 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.733797 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7" containerID="5a47f6ec496395cc2a562a941baf9693a41e82d21a5409e26b280f98351cf766" 
exitCode=0 Apr 22 20:00:35.734033 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.733890 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f77r8" event={"ID":"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7","Type":"ContainerDied","Data":"5a47f6ec496395cc2a562a941baf9693a41e82d21a5409e26b280f98351cf766"} Apr 22 20:00:35.735701 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.735668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" event={"ID":"9187cac0-ce5c-42b4-93e0-7efd963a5632","Type":"ContainerStarted","Data":"44bec9e17b3c126ce6f8754785bf0cf5ee388a46a8e2a7953a743f110b6eeb52"} Apr 22 20:00:35.735701 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.735699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" event={"ID":"9187cac0-ce5c-42b4-93e0-7efd963a5632","Type":"ContainerStarted","Data":"28a13e670e40e0a4a41d235ec489aad514d9ae91cb22d4f8cd0ec5e68f32016f"} Apr 22 20:00:35.735865 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.735714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" event={"ID":"9187cac0-ce5c-42b4-93e0-7efd963a5632","Type":"ContainerStarted","Data":"27a219fdfeb78e9933a56e220ce341821e0b590dc37acf6ca99967c7422bbe82"} Apr 22 20:00:35.787717 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.787686 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz"] Apr 22 20:00:35.790251 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.790224 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.793291 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793140 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 20:00:35.793291 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793156 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bjl2j1tfg6g80\"" Apr 22 20:00:35.793291 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 20:00:35.793291 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793240 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 20:00:35.793291 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793257 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-x9dw9\"" Apr 22 20:00:35.793825 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 20:00:35.793825 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.793541 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 20:00:35.802034 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.802014 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz"] Apr 22 20:00:35.923881 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.923859 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-tls\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924015 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.923889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3367fec7-9706-4628-83b7-af7a1e3cc219-metrics-client-ca\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924087 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.924003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924087 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.924050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924193 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.924097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924193 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.924146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxxg\" (UniqueName: \"kubernetes.io/projected/3367fec7-9706-4628-83b7-af7a1e3cc219-kube-api-access-6cxxg\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924265 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.924229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-grpc-tls\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:35.924295 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:35.924265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025463 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025463 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025463 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025713 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxxg\" (UniqueName: \"kubernetes.io/projected/3367fec7-9706-4628-83b7-af7a1e3cc219-kube-api-access-6cxxg\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025713 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-grpc-tls\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: 
\"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025713 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025713 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-tls\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.025713 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.025597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3367fec7-9706-4628-83b7-af7a1e3cc219-metrics-client-ca\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028116 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.028091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028259 ip-10-0-128-160 kubenswrapper[2575]: I0422 
20:00:36.028125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028464 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.028432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028464 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.028445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3367fec7-9706-4628-83b7-af7a1e3cc219-metrics-client-ca\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028610 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.028565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-tls\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028716 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.028698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-grpc-tls\") pod 
\"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.028864 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.028845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3367fec7-9706-4628-83b7-af7a1e3cc219-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.033241 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.033224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxxg\" (UniqueName: \"kubernetes.io/projected/3367fec7-9706-4628-83b7-af7a1e3cc219-kube-api-access-6cxxg\") pod \"thanos-querier-5bcc7bcbdf-dx4zz\" (UID: \"3367fec7-9706-4628-83b7-af7a1e3cc219\") " pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.101788 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.101768 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:36.219124 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.219093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz"] Apr 22 20:00:36.222291 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:36.222265 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3367fec7_9706_4628_83b7_af7a1e3cc219.slice/crio-1e04437498b6f84e28d3a6069b806ad9e5211001c4ca06825369923e963ebe3e WatchSource:0}: Error finding container 1e04437498b6f84e28d3a6069b806ad9e5211001c4ca06825369923e963ebe3e: Status 404 returned error can't find the container with id 1e04437498b6f84e28d3a6069b806ad9e5211001c4ca06825369923e963ebe3e Apr 22 20:00:36.740803 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.740757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" event={"ID":"9187cac0-ce5c-42b4-93e0-7efd963a5632","Type":"ContainerStarted","Data":"d676a4a17336cb8ea0b591c3c710b9c2b906fd4031875bd9808d3a480014580a"} Apr 22 20:00:36.742198 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.742166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"1e04437498b6f84e28d3a6069b806ad9e5211001c4ca06825369923e963ebe3e"} Apr 22 20:00:36.744398 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.744360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f77r8" event={"ID":"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7","Type":"ContainerStarted","Data":"8a52f2f24c311d101a5b146021e1c6c5f9834390421b03020797f982ca9ba9b9"} Apr 22 20:00:36.744398 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.744392 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/node-exporter-f77r8" event={"ID":"f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7","Type":"ContainerStarted","Data":"d3a6312d60348eefac8ccfbe6104eda09381b91eb1ef072bf42c4489a5f0db16"} Apr 22 20:00:36.758617 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.758569 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vsgb5" podStartSLOduration=2.886713503 podStartE2EDuration="3.758552507s" podCreationTimestamp="2026-04-22 20:00:33 +0000 UTC" firstStartedPulling="2026-04-22 20:00:34.977502855 +0000 UTC m=+114.174823719" lastFinishedPulling="2026-04-22 20:00:35.849341854 +0000 UTC m=+115.046662723" observedRunningTime="2026-04-22 20:00:36.75849927 +0000 UTC m=+115.955820153" watchObservedRunningTime="2026-04-22 20:00:36.758552507 +0000 UTC m=+115.955873391" Apr 22 20:00:36.780575 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:36.780528 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f77r8" podStartSLOduration=3.179693552 podStartE2EDuration="3.780514121s" podCreationTimestamp="2026-04-22 20:00:33 +0000 UTC" firstStartedPulling="2026-04-22 20:00:34.715190339 +0000 UTC m=+113.912511201" lastFinishedPulling="2026-04-22 20:00:35.316010905 +0000 UTC m=+114.513331770" observedRunningTime="2026-04-22 20:00:36.779723212 +0000 UTC m=+115.977044097" watchObservedRunningTime="2026-04-22 20:00:36.780514121 +0000 UTC m=+115.977835006" Apr 22 20:00:38.072270 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.072242 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c5d448c7f-h5wxd"] Apr 22 20:00:38.074141 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.074121 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.079020 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.078989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jtwns\"" Apr 22 20:00:38.079020 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.079007 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 20:00:38.079020 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.079001 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-btht3sahas5mu\"" Apr 22 20:00:38.079215 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.078997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 20:00:38.079312 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.079296 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 20:00:38.079380 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.079311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 20:00:38.085501 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.085482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c5d448c7f-h5wxd"] Apr 22 20:00:38.144251 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.144196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-secret-metrics-server-tls\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " 
pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.144251 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.144231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f424c575-8486-4afb-9b1a-71a9d4777588-metrics-server-audit-profiles\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.144388 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.144283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f424c575-8486-4afb-9b1a-71a9d4777588-audit-log\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.144388 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.144305 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-secret-metrics-server-client-certs\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.144388 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.144323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f424c575-8486-4afb-9b1a-71a9d4777588-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.144388 ip-10-0-128-160 kubenswrapper[2575]: I0422 
20:00:38.144368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcd5j\" (UniqueName: \"kubernetes.io/projected/f424c575-8486-4afb-9b1a-71a9d4777588-kube-api-access-bcd5j\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.144531 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.144405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-client-ca-bundle\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.244938 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.244902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f424c575-8486-4afb-9b1a-71a9d4777588-metrics-server-audit-profiles\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245065 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.244973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f424c575-8486-4afb-9b1a-71a9d4777588-audit-log\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245065 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-secret-metrics-server-client-certs\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245065 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f424c575-8486-4afb-9b1a-71a9d4777588-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245065 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcd5j\" (UniqueName: \"kubernetes.io/projected/f424c575-8486-4afb-9b1a-71a9d4777588-kube-api-access-bcd5j\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245335 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-client-ca-bundle\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245335 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-secret-metrics-server-tls\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " 
pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245443 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f424c575-8486-4afb-9b1a-71a9d4777588-audit-log\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f424c575-8486-4afb-9b1a-71a9d4777588-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.245965 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.245947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f424c575-8486-4afb-9b1a-71a9d4777588-metrics-server-audit-profiles\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.247602 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.247577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-secret-metrics-server-client-certs\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.247702 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.247630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-secret-metrics-server-tls\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.247955 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.247935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424c575-8486-4afb-9b1a-71a9d4777588-client-ca-bundle\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.252424 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.252406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcd5j\" (UniqueName: \"kubernetes.io/projected/f424c575-8486-4afb-9b1a-71a9d4777588-kube-api-access-bcd5j\") pod \"metrics-server-7c5d448c7f-h5wxd\" (UID: \"f424c575-8486-4afb-9b1a-71a9d4777588\") " pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.383361 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.383333 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:38.517812 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.517764 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c5d448c7f-h5wxd"] Apr 22 20:00:38.560993 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.560964 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt"] Apr 22 20:00:38.563469 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.563453 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:38.566135 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.566113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 20:00:38.566228 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.566168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-x942b\"" Apr 22 20:00:38.571654 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.571632 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt"] Apr 22 20:00:38.617078 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:38.617045 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf424c575_8486_4afb_9b1a_71a9d4777588.slice/crio-97938d1fb3cacf6b7cf74ea8414cfdfc681f8978c477c9427659e909dcfe8338 WatchSource:0}: Error finding container 97938d1fb3cacf6b7cf74ea8414cfdfc681f8978c477c9427659e909dcfe8338: Status 404 returned error can't find the container with id 97938d1fb3cacf6b7cf74ea8414cfdfc681f8978c477c9427659e909dcfe8338 Apr 22 20:00:38.648106 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.648082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ea8a3d9c-c37d-4843-a542-71f8f152e29c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h72kt\" (UID: \"ea8a3d9c-c37d-4843-a542-71f8f152e29c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:38.748640 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.748611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/ea8a3d9c-c37d-4843-a542-71f8f152e29c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h72kt\" (UID: \"ea8a3d9c-c37d-4843-a542-71f8f152e29c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:38.750805 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.750784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ea8a3d9c-c37d-4843-a542-71f8f152e29c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h72kt\" (UID: \"ea8a3d9c-c37d-4843-a542-71f8f152e29c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:38.751695 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.751653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" event={"ID":"f424c575-8486-4afb-9b1a-71a9d4777588","Type":"ContainerStarted","Data":"97938d1fb3cacf6b7cf74ea8414cfdfc681f8978c477c9427659e909dcfe8338"} Apr 22 20:00:38.754021 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.754000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"cc0be332657f896404d1bdd9b9bf2fc6abc5a5aa75c2b646ed0da6136b1de763"} Apr 22 20:00:38.754101 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.754031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"b963be350beae33148716d3ff506c72efbb79afe16cdc4b0605e7e15dd76052b"} Apr 22 20:00:38.754101 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.754044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" 
event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"0017d6a907b72fc24d1548a0049177d15664a4e87df85df315d6f00e19608ab1"} Apr 22 20:00:38.754101 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.754057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"a209762cb9734d0498b0dcdbb4782121fbb22ed2e2d28d166dd71e24e5c260bb"} Apr 22 20:00:38.877060 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.877025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:38.985461 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:38.985389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt"] Apr 22 20:00:38.987416 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:38.987390 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8a3d9c_c37d_4843_a542_71f8f152e29c.slice/crio-6eccb03d02a7c245dcd4cd83d99d12973eaf6337893fc5a8c5a9352133d4e81b WatchSource:0}: Error finding container 6eccb03d02a7c245dcd4cd83d99d12973eaf6337893fc5a8c5a9352133d4e81b: Status 404 returned error can't find the container with id 6eccb03d02a7c245dcd4cd83d99d12973eaf6337893fc5a8c5a9352133d4e81b Apr 22 20:00:39.769902 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:39.769848 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"142881e0ca02f4dc1a16c4eb91db1317f1596818864162c7a08ebb62f11c528f"} Apr 22 20:00:39.770383 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:39.769907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" event={"ID":"3367fec7-9706-4628-83b7-af7a1e3cc219","Type":"ContainerStarted","Data":"be49078012bebca85b9ea7dc32d862ff75a9acda5dd662ffc7e3c91f9d99b139"} Apr 22 20:00:39.770383 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:39.770095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:39.771465 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:39.771432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" event={"ID":"ea8a3d9c-c37d-4843-a542-71f8f152e29c","Type":"ContainerStarted","Data":"6eccb03d02a7c245dcd4cd83d99d12973eaf6337893fc5a8c5a9352133d4e81b"} Apr 22 20:00:39.792701 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:39.792655 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" podStartSLOduration=2.379351205 podStartE2EDuration="4.792637768s" podCreationTimestamp="2026-04-22 20:00:35 +0000 UTC" firstStartedPulling="2026-04-22 20:00:36.224313768 +0000 UTC m=+115.421634632" lastFinishedPulling="2026-04-22 20:00:38.637600331 +0000 UTC m=+117.834921195" observedRunningTime="2026-04-22 20:00:39.791125199 +0000 UTC m=+118.988446084" watchObservedRunningTime="2026-04-22 20:00:39.792637768 +0000 UTC m=+118.989958659" Apr 22 20:00:40.776402 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:40.776354 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" event={"ID":"f424c575-8486-4afb-9b1a-71a9d4777588","Type":"ContainerStarted","Data":"5c8572bebfd646b51d5f96eae7ca800712048ecab91a0b2c44b0d44ffde4ce9d"} Apr 22 20:00:40.778210 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:40.778156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" 
event={"ID":"ea8a3d9c-c37d-4843-a542-71f8f152e29c","Type":"ContainerStarted","Data":"689de198cd538846fccade31734d78fce5c1aad6eeed0c0d134e59e6e9f10542"} Apr 22 20:00:40.778484 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:40.778369 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:40.783569 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:40.783550 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" Apr 22 20:00:40.794796 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:40.794753 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" podStartSLOduration=1.090134295 podStartE2EDuration="2.794742834s" podCreationTimestamp="2026-04-22 20:00:38 +0000 UTC" firstStartedPulling="2026-04-22 20:00:38.618940392 +0000 UTC m=+117.816261255" lastFinishedPulling="2026-04-22 20:00:40.323548929 +0000 UTC m=+119.520869794" observedRunningTime="2026-04-22 20:00:40.792503646 +0000 UTC m=+119.989824529" watchObservedRunningTime="2026-04-22 20:00:40.794742834 +0000 UTC m=+119.992063717" Apr 22 20:00:40.806177 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:40.806138 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h72kt" podStartSLOduration=1.46983901 podStartE2EDuration="2.806128466s" podCreationTimestamp="2026-04-22 20:00:38 +0000 UTC" firstStartedPulling="2026-04-22 20:00:38.989299134 +0000 UTC m=+118.186619998" lastFinishedPulling="2026-04-22 20:00:40.325588579 +0000 UTC m=+119.522909454" observedRunningTime="2026-04-22 20:00:40.80581013 +0000 UTC m=+120.003131012" watchObservedRunningTime="2026-04-22 20:00:40.806128466 +0000 UTC m=+120.003449348" Apr 22 20:00:41.332028 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:41.332003 2575 
scope.go:117] "RemoveContainer" containerID="1d578c39360e2e7db9b028675452b033fe4aad1f01280c8492bf850caa567866" Apr 22 20:00:41.332204 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:00:41.332176 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-4v6ps_openshift-console-operator(d677224f-4cd1-4503-9169-7d8613c67e70)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podUID="d677224f-4cd1-4503-9169-7d8613c67e70" Apr 22 20:00:45.783958 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:45.783906 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5bcc7bcbdf-dx4zz" Apr 22 20:00:51.048318 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:51.048286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 20:00:51.050504 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:51.050483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c3e7957-917d-44e3-8833-76ccc4a5d167-metrics-certs\") pod \"network-metrics-daemon-rw25k\" (UID: \"7c3e7957-917d-44e3-8833-76ccc4a5d167\") " pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 20:00:51.343106 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:51.343036 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-79t2x\"" Apr 22 20:00:51.350466 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:51.350445 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rw25k" Apr 22 20:00:51.466219 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:51.466191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rw25k"] Apr 22 20:00:51.469023 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:00:51.468999 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3e7957_917d_44e3_8833_76ccc4a5d167.slice/crio-7e99bfc60b6d11d94260a480f792b28bc8bb487dc1e2925694ce6c079d2bf153 WatchSource:0}: Error finding container 7e99bfc60b6d11d94260a480f792b28bc8bb487dc1e2925694ce6c079d2bf153: Status 404 returned error can't find the container with id 7e99bfc60b6d11d94260a480f792b28bc8bb487dc1e2925694ce6c079d2bf153 Apr 22 20:00:51.813180 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:51.813143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rw25k" event={"ID":"7c3e7957-917d-44e3-8833-76ccc4a5d167","Type":"ContainerStarted","Data":"7e99bfc60b6d11d94260a480f792b28bc8bb487dc1e2925694ce6c079d2bf153"} Apr 22 20:00:52.330492 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.330462 2575 scope.go:117] "RemoveContainer" containerID="1d578c39360e2e7db9b028675452b033fe4aad1f01280c8492bf850caa567866" Apr 22 20:00:52.817614 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.817584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log" Apr 22 20:00:52.817779 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.817682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" event={"ID":"d677224f-4cd1-4503-9169-7d8613c67e70","Type":"ContainerStarted","Data":"48841f9ab37edbd4f204364bc5249988c4c628791e4ef9d9e9a7c0bd7af11d2e"} Apr 22 
20:00:52.818230 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.818071 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:52.820124 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.819596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rw25k" event={"ID":"7c3e7957-917d-44e3-8833-76ccc4a5d167","Type":"ContainerStarted","Data":"ca18135d8757e9e3b7c78ca154ac9ef004d54336e4b8186ce05e45e49c7f188e"} Apr 22 20:00:52.820124 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.819628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rw25k" event={"ID":"7c3e7957-917d-44e3-8833-76ccc4a5d167","Type":"ContainerStarted","Data":"463724790c03f05c3f3981a79e53608de112020552aa1bf7feb59d51c8f537e2"} Apr 22 20:00:52.835749 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.835674 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" podStartSLOduration=51.050621012 podStartE2EDuration="52.835662795s" podCreationTimestamp="2026-04-22 20:00:00 +0000 UTC" firstStartedPulling="2026-04-22 20:00:01.109358448 +0000 UTC m=+80.306679324" lastFinishedPulling="2026-04-22 20:00:02.894400021 +0000 UTC m=+82.091721107" observedRunningTime="2026-04-22 20:00:52.834088757 +0000 UTC m=+132.031409641" watchObservedRunningTime="2026-04-22 20:00:52.835662795 +0000 UTC m=+132.032983677" Apr 22 20:00:52.849822 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:52.849783 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rw25k" podStartSLOduration=130.955641014 podStartE2EDuration="2m11.849770872s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 20:00:51.470975357 +0000 UTC m=+130.668296224" 
lastFinishedPulling="2026-04-22 20:00:52.365105205 +0000 UTC m=+131.562426082" observedRunningTime="2026-04-22 20:00:52.847987683 +0000 UTC m=+132.045308566" watchObservedRunningTime="2026-04-22 20:00:52.849770872 +0000 UTC m=+132.047091754" Apr 22 20:00:53.711675 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:53.711642 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4v6ps" Apr 22 20:00:58.384087 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:58.384058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:00:58.384087 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:00:58.384091 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:01:17.489758 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:17.489726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-d2pwr_8dfd4ef7-39de-4af8-be99-d4e588c62cf8/cluster-monitoring-operator/0.log" Apr 22 20:01:18.287728 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:18.287699 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c5d448c7f-h5wxd_f424c575-8486-4afb-9b1a-71a9d4777588/metrics-server/0.log" Apr 22 20:01:18.389444 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:18.389410 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:01:18.393366 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:18.393340 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c5d448c7f-h5wxd" Apr 22 20:01:18.486880 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:18.486851 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-h72kt_ea8a3d9c-c37d-4843-a542-71f8f152e29c/monitoring-plugin/0.log" Apr 22 20:01:18.686773 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:18.686751 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f77r8_f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7/init-textfile/0.log" Apr 22 20:01:18.888439 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:18.888402 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f77r8_f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7/node-exporter/0.log" Apr 22 20:01:19.090291 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:19.090218 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f77r8_f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7/kube-rbac-proxy/0.log" Apr 22 20:01:20.487264 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:20.487233 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vsgb5_9187cac0-ce5c-42b4-93e0-7efd963a5632/kube-rbac-proxy-main/0.log" Apr 22 20:01:20.687165 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:20.687137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vsgb5_9187cac0-ce5c-42b4-93e0-7efd963a5632/kube-rbac-proxy-self/0.log" Apr 22 20:01:20.886679 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:20.886607 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vsgb5_9187cac0-ce5c-42b4-93e0-7efd963a5632/openshift-state-metrics/0.log" Apr 22 20:01:22.489615 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:22.489573 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dhg8l_5ddc495d-0284-42b1-b944-aa586284e1fa/prometheus-operator/0.log" Apr 22 20:01:22.686911 
ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:22.686876 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dhg8l_5ddc495d-0284-42b1-b944-aa586284e1fa/kube-rbac-proxy/0.log" Apr 22 20:01:23.087858 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:23.087834 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/thanos-query/0.log" Apr 22 20:01:23.287377 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:23.287352 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy-web/0.log" Apr 22 20:01:23.487108 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:23.487086 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy/0.log" Apr 22 20:01:23.698432 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:23.698403 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/prom-label-proxy/0.log" Apr 22 20:01:23.886845 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:23.886761 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy-rules/0.log" Apr 22 20:01:24.087856 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:24.087788 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy-metrics/0.log" Apr 22 20:01:24.487009 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:24.486975 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log" Apr 22 20:01:24.689748 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:24.689712 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/3.log" Apr 22 20:01:25.886876 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:25.886847 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dvs95_ddc58e3d-3248-48a6-990e-23900cf6ae7f/dns/0.log" Apr 22 20:01:26.088379 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:26.088351 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dvs95_ddc58e3d-3248-48a6-990e-23900cf6ae7f/kube-rbac-proxy/0.log" Apr 22 20:01:26.687201 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:01:26.687175 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hrntb_7047d9ee-3de6-4d53-a8d2-f3e0ea19b774/dns-node-resolver/0.log" Apr 22 20:03:41.236596 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:41.236571 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log" Apr 22 20:03:41.237076 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:41.236756 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log" Apr 22 20:03:41.242487 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:41.242462 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 20:03:41.242994 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:41.242976 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 20:03:57.500432 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.500399 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kpxqh"] Apr 22 20:03:57.503425 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.503406 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.506016 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.505990 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 20:03:57.510563 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.510536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kpxqh"] Apr 22 20:03:57.631881 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.631853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-dbus\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.632021 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.631890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-kubelet-config\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.632021 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.631957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-original-pull-secret\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.733301 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.733276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-original-pull-secret\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.733418 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.733324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-dbus\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.733418 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.733350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-kubelet-config\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.733494 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.733421 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-kubelet-config\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.733614 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.733595 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-dbus\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.735418 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.735399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3-original-pull-secret\") pod \"global-pull-secret-syncer-kpxqh\" (UID: \"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3\") " pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.812679 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.812629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpxqh" Apr 22 20:03:57.923965 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.923889 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kpxqh"] Apr 22 20:03:57.926146 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:03:57.926119 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c9c0b5_cfb0_4848_beb9_0dd2fcb635d3.slice/crio-6d6093c1258cee87a73eb0da78e92de4e2627e795d8c5ce539eca54bd034e378 WatchSource:0}: Error finding container 6d6093c1258cee87a73eb0da78e92de4e2627e795d8c5ce539eca54bd034e378: Status 404 returned error can't find the container with id 6d6093c1258cee87a73eb0da78e92de4e2627e795d8c5ce539eca54bd034e378 Apr 22 20:03:57.927728 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:57.927711 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:03:58.317414 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:03:58.317378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-kpxqh" event={"ID":"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3","Type":"ContainerStarted","Data":"6d6093c1258cee87a73eb0da78e92de4e2627e795d8c5ce539eca54bd034e378"} Apr 22 20:04:02.330097 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:02.330015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kpxqh" event={"ID":"d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3","Type":"ContainerStarted","Data":"6490672e1bd09533238201e903baf46c49400a24f5f45bde123deb2eff10c677"} Apr 22 20:04:02.345366 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:02.345323 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kpxqh" podStartSLOduration=1.360121545 podStartE2EDuration="5.345307609s" podCreationTimestamp="2026-04-22 20:03:57 +0000 UTC" firstStartedPulling="2026-04-22 20:03:57.927832046 +0000 UTC m=+317.125152910" lastFinishedPulling="2026-04-22 20:04:01.913018109 +0000 UTC m=+321.110338974" observedRunningTime="2026-04-22 20:04:02.344662767 +0000 UTC m=+321.541983650" watchObservedRunningTime="2026-04-22 20:04:02.345307609 +0000 UTC m=+321.542628495" Apr 22 20:04:16.197361 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.197326 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"] Apr 22 20:04:16.202042 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.202027 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" Apr 22 20:04:16.204965 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.204935 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 20:04:16.205081 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.204977 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 20:04:16.205081 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.205010 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 20:04:16.206638 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.206617 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 20:04:16.208237 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.208216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"] Apr 22 20:04:16.214340 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.214319 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"] Apr 22 20:04:16.218447 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.218427 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.223044 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.223012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 20:04:16.223044 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.223029 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 20:04:16.223044 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.223037 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 20:04:16.223247 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.223012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 20:04:16.227169 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.227150 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"] Apr 22 20:04:16.363092 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-hub\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.363092 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.363290 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6j8\" (UniqueName: \"kubernetes.io/projected/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-kube-api-access-5x6j8\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.363290 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfk5\" (UniqueName: \"kubernetes.io/projected/e1656ae7-0304-490d-8cb7-dd13e1a86c21-kube-api-access-bzfk5\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" Apr 22 20:04:16.363290 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-ca\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.363290 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.363290 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e1656ae7-0304-490d-8cb7-dd13e1a86c21-klusterlet-config\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" Apr 22 20:04:16.363503 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1656ae7-0304-490d-8cb7-dd13e1a86c21-tmp\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" Apr 22 20:04:16.363503 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.363330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" Apr 22 20:04:16.464639 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfk5\" (UniqueName: \"kubernetes.io/projected/e1656ae7-0304-490d-8cb7-dd13e1a86c21-kube-api-access-bzfk5\") pod 
\"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.464639 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-ca\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.464790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.464790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e1656ae7-0304-490d-8cb7-dd13e1a86c21-klusterlet-config\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.464790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1656ae7-0304-490d-8cb7-dd13e1a86c21-tmp\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.464790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.464790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-hub\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.464790 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.465105 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.464909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6j8\" (UniqueName: \"kubernetes.io/projected/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-kube-api-access-5x6j8\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.465169 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.465147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1656ae7-0304-490d-8cb7-dd13e1a86c21-tmp\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.465616 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.465589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.467506 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.467469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-ca\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.467506 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.467479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.467674 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.467576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-hub\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.467733 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.467678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e1656ae7-0304-490d-8cb7-dd13e1a86c21-klusterlet-config\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.467823 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.467800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.474467 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.474448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfk5\" (UniqueName: \"kubernetes.io/projected/e1656ae7-0304-490d-8cb7-dd13e1a86c21-kube-api-access-bzfk5\") pod \"klusterlet-addon-workmgr-7db6dbf898-rxxk4\" (UID: \"e1656ae7-0304-490d-8cb7-dd13e1a86c21\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.474592 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.474574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6j8\" (UniqueName: \"kubernetes.io/projected/0fa3a7b7-21ef-4e70-9f49-d1fe9670593d-kube-api-access-5x6j8\") pod \"cluster-proxy-proxy-agent-59bbb888d8-nh55w\" (UID: \"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.511982 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.511962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:16.535639 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.535617 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"
Apr 22 20:04:16.640577 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.640545 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"]
Apr 22 20:04:16.644060 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:04:16.644026 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1656ae7_0304_490d_8cb7_dd13e1a86c21.slice/crio-cf16055f78400c142814eab3b5a99dd66fc948a74d81ff3ba7e5577863d14391 WatchSource:0}: Error finding container cf16055f78400c142814eab3b5a99dd66fc948a74d81ff3ba7e5577863d14391: Status 404 returned error can't find the container with id cf16055f78400c142814eab3b5a99dd66fc948a74d81ff3ba7e5577863d14391
Apr 22 20:04:16.662439 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:16.662420 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w"]
Apr 22 20:04:16.664453 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:04:16.664426 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fa3a7b7_21ef_4e70_9f49_d1fe9670593d.slice/crio-22bc10750a61dc3875d4a9be5b96ba41b00467f4153540a34c7f349b233485d0 WatchSource:0}: Error finding container 22bc10750a61dc3875d4a9be5b96ba41b00467f4153540a34c7f349b233485d0: Status 404 returned error can't find the container with id 22bc10750a61dc3875d4a9be5b96ba41b00467f4153540a34c7f349b233485d0
Apr 22 20:04:17.368152 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:17.368111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" event={"ID":"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d","Type":"ContainerStarted","Data":"22bc10750a61dc3875d4a9be5b96ba41b00467f4153540a34c7f349b233485d0"}
Apr 22 20:04:17.369015 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:17.368994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" event={"ID":"e1656ae7-0304-490d-8cb7-dd13e1a86c21","Type":"ContainerStarted","Data":"cf16055f78400c142814eab3b5a99dd66fc948a74d81ff3ba7e5577863d14391"}
Apr 22 20:04:21.386385 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:21.386350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" event={"ID":"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d","Type":"ContainerStarted","Data":"c435943545014fb81b4741567526cf774a9bc31dcec6b82ef5bfca43262053ad"}
Apr 22 20:04:21.387537 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:21.387510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" event={"ID":"e1656ae7-0304-490d-8cb7-dd13e1a86c21","Type":"ContainerStarted","Data":"2639ed19fa980b70e5774c4e081649042f81ac9d45c72ef0392dd762a6f06fbe"}
Apr 22 20:04:21.387719 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:21.387701 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:21.389341 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:21.389322 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4"
Apr 22 20:04:21.404244 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:21.404175 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db6dbf898-rxxk4" podStartSLOduration=0.898624579 podStartE2EDuration="5.404160444s" podCreationTimestamp="2026-04-22 20:04:16 +0000 UTC" firstStartedPulling="2026-04-22 20:04:16.645860025 +0000 UTC m=+335.843180889" lastFinishedPulling="2026-04-22 20:04:21.151395879 +0000 UTC m=+340.348716754" observedRunningTime="2026-04-22 20:04:21.403618448 +0000 UTC m=+340.600939330" watchObservedRunningTime="2026-04-22 20:04:21.404160444 +0000 UTC m=+340.601481322"
Apr 22 20:04:23.394230 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:23.394144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" event={"ID":"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d","Type":"ContainerStarted","Data":"e026c24da1d2c0650b5fd04112a51cee46552f091b2457c5101c19e6ee94eb59"}
Apr 22 20:04:23.394230 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:23.394182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" event={"ID":"0fa3a7b7-21ef-4e70-9f49-d1fe9670593d","Type":"ContainerStarted","Data":"8a72690e7fcdc869e38df94cc1474ecb59e077be38fd6c201a0fee8f041683b0"}
Apr 22 20:04:23.413607 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:04:23.413559 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59bbb888d8-nh55w" podStartSLOduration=1.171708603 podStartE2EDuration="7.413542921s" podCreationTimestamp="2026-04-22 20:04:16 +0000 UTC" firstStartedPulling="2026-04-22 20:04:16.666053159 +0000 UTC m=+335.863374024" lastFinishedPulling="2026-04-22 20:04:22.907887471 +0000 UTC m=+342.105208342" observedRunningTime="2026-04-22 20:04:23.411444778 +0000 UTC m=+342.608765663" watchObservedRunningTime="2026-04-22 20:04:23.413542921 +0000 UTC m=+342.610863805"
Apr 22 20:07:32.971268 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:32.971189 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"]
Apr 22 20:07:32.974608 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:32.974585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:07:32.977347 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:32.977325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4jdsb\""
Apr 22 20:07:32.982466 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:32.982280 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"]
Apr 22 20:07:33.047942 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:33.047898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2daa5f1-6dc2-44eb-82f9-cef97d519395-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-2sbxg\" (UID: \"a2daa5f1-6dc2-44eb-82f9-cef97d519395\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:07:33.149001 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:33.148972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2daa5f1-6dc2-44eb-82f9-cef97d519395-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-2sbxg\" (UID: \"a2daa5f1-6dc2-44eb-82f9-cef97d519395\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:07:33.149354 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:33.149331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2daa5f1-6dc2-44eb-82f9-cef97d519395-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-2sbxg\" (UID: \"a2daa5f1-6dc2-44eb-82f9-cef97d519395\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:07:33.286343 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:33.286282 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:07:33.400728 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:33.400691 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"]
Apr 22 20:07:33.405874 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:07:33.405840 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2daa5f1_6dc2_44eb_82f9_cef97d519395.slice/crio-5bcf7e48f7bd8f2d1b9f311535adb2a3dd5fe6448277f12e152ac3198beb5a9f WatchSource:0}: Error finding container 5bcf7e48f7bd8f2d1b9f311535adb2a3dd5fe6448277f12e152ac3198beb5a9f: Status 404 returned error can't find the container with id 5bcf7e48f7bd8f2d1b9f311535adb2a3dd5fe6448277f12e152ac3198beb5a9f
Apr 22 20:07:33.914220 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:33.914176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" event={"ID":"a2daa5f1-6dc2-44eb-82f9-cef97d519395","Type":"ContainerStarted","Data":"5bcf7e48f7bd8f2d1b9f311535adb2a3dd5fe6448277f12e152ac3198beb5a9f"}
Apr 22 20:07:37.928754 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:37.928715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" event={"ID":"a2daa5f1-6dc2-44eb-82f9-cef97d519395","Type":"ContainerStarted","Data":"f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14"}
Apr 22 20:07:40.937763 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:40.937728 2575 generic.go:358] "Generic (PLEG): container finished" podID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerID="f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14" exitCode=0
Apr 22 20:07:40.938128 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:07:40.937768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" event={"ID":"a2daa5f1-6dc2-44eb-82f9-cef97d519395","Type":"ContainerDied","Data":"f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14"}
Apr 22 20:08:02.002783 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:02.002751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" event={"ID":"a2daa5f1-6dc2-44eb-82f9-cef97d519395","Type":"ContainerStarted","Data":"f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3"}
Apr 22 20:08:02.003234 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:02.003070 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:08:02.004535 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:02.004483 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:08:02.019444 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:02.019394 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podStartSLOduration=2.169837625 podStartE2EDuration="30.019382988s" podCreationTimestamp="2026-04-22 20:07:32 +0000 UTC" firstStartedPulling="2026-04-22 20:07:33.40760376 +0000 UTC m=+532.604924625" lastFinishedPulling="2026-04-22 20:08:01.257149128 +0000 UTC m=+560.454469988" observedRunningTime="2026-04-22 20:08:02.019156798 +0000 UTC m=+561.216477682" watchObservedRunningTime="2026-04-22 20:08:02.019382988 +0000 UTC m=+561.216703870"
Apr 22 20:08:03.005954 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:03.005892 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:08:13.006184 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:13.006143 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:08:23.006502 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:23.006461 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:08:33.006573 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:33.006527 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:08:41.257557 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:41.257531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:08:41.258036 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:41.257632 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:08:41.263486 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:41.263464 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:08:41.263662 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:41.263645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:08:43.006885 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:43.006841 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:08:53.006178 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:08:53.006135 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:09:03.006679 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:03.006651 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:09:42.921890 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:42.921857 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"]
Apr 22 20:09:42.922529 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:42.922250 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" containerID="cri-o://f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3" gracePeriod=30
Apr 22 20:09:43.006419 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:43.006379 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 22 20:09:45.961581 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:45.961555 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:09:46.093164 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.093091 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2daa5f1-6dc2-44eb-82f9-cef97d519395-kserve-provision-location\") pod \"a2daa5f1-6dc2-44eb-82f9-cef97d519395\" (UID: \"a2daa5f1-6dc2-44eb-82f9-cef97d519395\") "
Apr 22 20:09:46.093383 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.093361 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2daa5f1-6dc2-44eb-82f9-cef97d519395-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2daa5f1-6dc2-44eb-82f9-cef97d519395" (UID: "a2daa5f1-6dc2-44eb-82f9-cef97d519395"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:46.194184 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.194155 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2daa5f1-6dc2-44eb-82f9-cef97d519395-kserve-provision-location\") on node \"ip-10-0-128-160.ec2.internal\" DevicePath \"\""
Apr 22 20:09:46.299741 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.299712 2575 generic.go:358] "Generic (PLEG): container finished" podID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerID="f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3" exitCode=0
Apr 22 20:09:46.299837 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.299788 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"
Apr 22 20:09:46.299837 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.299799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" event={"ID":"a2daa5f1-6dc2-44eb-82f9-cef97d519395","Type":"ContainerDied","Data":"f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3"}
Apr 22 20:09:46.299904 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.299837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg" event={"ID":"a2daa5f1-6dc2-44eb-82f9-cef97d519395","Type":"ContainerDied","Data":"5bcf7e48f7bd8f2d1b9f311535adb2a3dd5fe6448277f12e152ac3198beb5a9f"}
Apr 22 20:09:46.299904 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.299853 2575 scope.go:117] "RemoveContainer" containerID="f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3"
Apr 22 20:09:46.308126 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.308111 2575 scope.go:117] "RemoveContainer" containerID="f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14"
Apr 22 20:09:46.314965 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.314948 2575 scope.go:117] "RemoveContainer" containerID="f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3"
Apr 22 20:09:46.315209 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:09:46.315191 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3\": container with ID starting with f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3 not found: ID does not exist" containerID="f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3"
Apr 22 20:09:46.315263 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.315217 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3"} err="failed to get container status \"f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3\": rpc error: code = NotFound desc = could not find container \"f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3\": container with ID starting with f84d14b202b08d914bea7eeccb57b82e638d10c50cf581ee83dac79a445e9bc3 not found: ID does not exist"
Apr 22 20:09:46.315263 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.315245 2575 scope.go:117] "RemoveContainer" containerID="f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14"
Apr 22 20:09:46.315482 ip-10-0-128-160 kubenswrapper[2575]: E0422 20:09:46.315465 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14\": container with ID starting with f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14 not found: ID does not exist" containerID="f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14"
Apr 22 20:09:46.315525 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.315487 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14"} err="failed to get container status \"f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14\": rpc error: code = NotFound desc = could not find container \"f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14\": container with ID starting with f3625a891fdb2729bf3b8ed485cef10561fdb8c9095414bfb57f5b2e918b9a14 not found: ID does not exist"
Apr 22 20:09:46.323432 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.323407 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"]
Apr 22 20:09:46.326466 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:46.326448 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-2sbxg"]
Apr 22 20:09:47.333736 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:09:47.333704 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" path="/var/lib/kubelet/pods/a2daa5f1-6dc2-44eb-82f9-cef97d519395/volumes"
Apr 22 20:13:41.277373 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:13:41.277295 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:13:41.278649 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:13:41.278626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:13:41.283874 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:13:41.283854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:13:41.284765 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:13:41.284746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:18:41.301810 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:18:41.301777 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:18:41.306571 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:18:41.306543 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:18:41.310547 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:18:41.310528 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:18:41.312435 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:18:41.312420 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:23:41.327871 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:23:41.327759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:23:41.331842 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:23:41.328661 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:23:41.334618 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:23:41.334601 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:23:41.335589 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:23:41.335572 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:28:41.348382 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:28:41.348285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:28:41.355106 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:28:41.349811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:28:41.355106 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:28:41.353806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:28:41.355724 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:28:41.355707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:33:41.368928 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:33:41.368806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:33:41.373468 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:33:41.373447 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:33:41.375120 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:33:41.375101 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:33:41.379211 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:33:41.379195 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:38:41.387329 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:38:41.387221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:38:41.393242 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:38:41.393223 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:38:41.393369 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:38:41.393288 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:38:41.398508 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:38:41.398493 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:43:41.406685 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:43:41.406655 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:43:41.412416 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:43:41.412394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:43:41.412590 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:43:41.412575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log"
Apr 22 20:43:41.418448 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:43:41.418432 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log"
Apr 22 20:47:47.869451 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.869415 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xlhgw/must-gather-kk475"]
Apr 22 20:47:47.869846 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.869718 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container"
Apr 22 20:47:47.869846 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.869729 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container"
Apr 22 20:47:47.869846 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.869743 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="storage-initializer"
Apr 22 20:47:47.869846 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.869749 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="storage-initializer"
Apr 22 20:47:47.869846 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.869798 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2daa5f1-6dc2-44eb-82f9-cef97d519395" containerName="kserve-container"
Apr 22 20:47:47.872675 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.872654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xlhgw/must-gather-kk475"
Apr 22 20:47:47.875321 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.875302 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xlhgw\"/\"openshift-service-ca.crt\""
Apr 22 20:47:47.875424 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.875303 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xlhgw\"/\"kube-root-ca.crt\""
Apr 22 20:47:47.876734 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.876713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xlhgw\"/\"default-dockercfg-pm84g\""
Apr 22 20:47:47.881888 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.881868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlhgw/must-gather-kk475"]
Apr 22 20:47:47.994808 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.994785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2ef003a-b290-425e-aabd-f24e8543458a-must-gather-output\") pod \"must-gather-kk475\" (UID: \"c2ef003a-b290-425e-aabd-f24e8543458a\") " pod="openshift-must-gather-xlhgw/must-gather-kk475"
Apr 22 20:47:47.994944 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:47.994854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6r5l\" (UniqueName: \"kubernetes.io/projected/c2ef003a-b290-425e-aabd-f24e8543458a-kube-api-access-m6r5l\") pod \"must-gather-kk475\" (UID: \"c2ef003a-b290-425e-aabd-f24e8543458a\") "
pod="openshift-must-gather-xlhgw/must-gather-kk475" Apr 22 20:47:48.095942 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.095901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6r5l\" (UniqueName: \"kubernetes.io/projected/c2ef003a-b290-425e-aabd-f24e8543458a-kube-api-access-m6r5l\") pod \"must-gather-kk475\" (UID: \"c2ef003a-b290-425e-aabd-f24e8543458a\") " pod="openshift-must-gather-xlhgw/must-gather-kk475" Apr 22 20:47:48.096037 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.095954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2ef003a-b290-425e-aabd-f24e8543458a-must-gather-output\") pod \"must-gather-kk475\" (UID: \"c2ef003a-b290-425e-aabd-f24e8543458a\") " pod="openshift-must-gather-xlhgw/must-gather-kk475" Apr 22 20:47:48.096221 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.096207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2ef003a-b290-425e-aabd-f24e8543458a-must-gather-output\") pod \"must-gather-kk475\" (UID: \"c2ef003a-b290-425e-aabd-f24e8543458a\") " pod="openshift-must-gather-xlhgw/must-gather-kk475" Apr 22 20:47:48.103621 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.103596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6r5l\" (UniqueName: \"kubernetes.io/projected/c2ef003a-b290-425e-aabd-f24e8543458a-kube-api-access-m6r5l\") pod \"must-gather-kk475\" (UID: \"c2ef003a-b290-425e-aabd-f24e8543458a\") " pod="openshift-must-gather-xlhgw/must-gather-kk475" Apr 22 20:47:48.188296 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.188274 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlhgw/must-gather-kk475" Apr 22 20:47:48.301931 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.301892 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlhgw/must-gather-kk475"] Apr 22 20:47:48.304452 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:47:48.304429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ef003a_b290_425e_aabd_f24e8543458a.slice/crio-dbfd0d4c221423b5c79a4b8a5fef1d33bb258d82926fba31f91e7352bc7309e1 WatchSource:0}: Error finding container dbfd0d4c221423b5c79a4b8a5fef1d33bb258d82926fba31f91e7352bc7309e1: Status 404 returned error can't find the container with id dbfd0d4c221423b5c79a4b8a5fef1d33bb258d82926fba31f91e7352bc7309e1 Apr 22 20:47:48.306043 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.306027 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:47:48.553618 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:48.553540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/must-gather-kk475" event={"ID":"c2ef003a-b290-425e-aabd-f24e8543458a","Type":"ContainerStarted","Data":"dbfd0d4c221423b5c79a4b8a5fef1d33bb258d82926fba31f91e7352bc7309e1"} Apr 22 20:47:49.560380 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:49.560345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/must-gather-kk475" event={"ID":"c2ef003a-b290-425e-aabd-f24e8543458a","Type":"ContainerStarted","Data":"0eb66da3f1df2647b61bfa2581ef6a5546fbec4f7bb3848be8d766dbb766dafe"} Apr 22 20:47:49.560785 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:49.560389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/must-gather-kk475" 
event={"ID":"c2ef003a-b290-425e-aabd-f24e8543458a","Type":"ContainerStarted","Data":"664a3d6d3e8303c4c52c75f207eb30b80fa107f02689fd5a475fcc0144278efa"} Apr 22 20:47:49.575344 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:49.575270 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xlhgw/must-gather-kk475" podStartSLOduration=1.8067514930000002 podStartE2EDuration="2.575255342s" podCreationTimestamp="2026-04-22 20:47:47 +0000 UTC" firstStartedPulling="2026-04-22 20:47:48.306149808 +0000 UTC m=+2947.503470673" lastFinishedPulling="2026-04-22 20:47:49.074653656 +0000 UTC m=+2948.271974522" observedRunningTime="2026-04-22 20:47:49.574511228 +0000 UTC m=+2948.771832113" watchObservedRunningTime="2026-04-22 20:47:49.575255342 +0000 UTC m=+2948.772576224" Apr 22 20:47:50.417410 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:50.417381 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kpxqh_d3c9c0b5-cfb0-4848-beb9-0dd2fcb635d3/global-pull-secret-syncer/0.log" Apr 22 20:47:50.522084 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:50.522055 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hjrqd_0ea8eea5-b4b4-40f6-a297-6b5a87a93a8c/konnectivity-agent/0.log" Apr 22 20:47:50.598619 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:50.598588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-160.ec2.internal_c49858937a727d87766821f66c42c625/haproxy/0.log" Apr 22 20:47:54.269775 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.269607 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-d2pwr_8dfd4ef7-39de-4af8-be99-d4e588c62cf8/cluster-monitoring-operator/0.log" Apr 22 20:47:54.369631 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.369556 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-7c5d448c7f-h5wxd_f424c575-8486-4afb-9b1a-71a9d4777588/metrics-server/0.log" Apr 22 20:47:54.395467 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.395440 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-h72kt_ea8a3d9c-c37d-4843-a542-71f8f152e29c/monitoring-plugin/0.log" Apr 22 20:47:54.425753 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.425725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f77r8_f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7/node-exporter/0.log" Apr 22 20:47:54.446872 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.446753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f77r8_f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7/kube-rbac-proxy/0.log" Apr 22 20:47:54.472649 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.472622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f77r8_f4eda699-e8d5-47b6-b1a3-62d6fc0c3fc7/init-textfile/0.log" Apr 22 20:47:54.643622 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.643517 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vsgb5_9187cac0-ce5c-42b4-93e0-7efd963a5632/kube-rbac-proxy-main/0.log" Apr 22 20:47:54.670887 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.670857 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vsgb5_9187cac0-ce5c-42b4-93e0-7efd963a5632/kube-rbac-proxy-self/0.log" Apr 22 20:47:54.691593 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.691567 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vsgb5_9187cac0-ce5c-42b4-93e0-7efd963a5632/openshift-state-metrics/0.log" Apr 22 20:47:54.889362 ip-10-0-128-160 
kubenswrapper[2575]: I0422 20:47:54.889332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dhg8l_5ddc495d-0284-42b1-b944-aa586284e1fa/prometheus-operator/0.log" Apr 22 20:47:54.908568 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:54.908545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dhg8l_5ddc495d-0284-42b1-b944-aa586284e1fa/kube-rbac-proxy/0.log" Apr 22 20:47:55.037456 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:55.037430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/thanos-query/0.log" Apr 22 20:47:55.061148 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:55.061124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy-web/0.log" Apr 22 20:47:55.087801 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:55.087776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy/0.log" Apr 22 20:47:55.112162 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:55.112137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/prom-label-proxy/0.log" Apr 22 20:47:55.136810 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:55.136777 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy-rules/0.log" Apr 22 20:47:55.164666 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:55.164586 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5bcc7bcbdf-dx4zz_3367fec7-9706-4628-83b7-af7a1e3cc219/kube-rbac-proxy-metrics/0.log" Apr 22 20:47:56.608407 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:56.608375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/2.log" Apr 22 20:47:56.614888 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:56.614861 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4v6ps_d677224f-4cd1-4503-9169-7d8613c67e70/console-operator/3.log" Apr 22 20:47:57.359645 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.359619 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-tvm9g_2d086995-4405-4443-9eb1-d1e194ba6492/volume-data-source-validator/0.log" Apr 22 20:47:57.492689 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.492660 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f"] Apr 22 20:47:57.497208 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.497184 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.505152 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.505130 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f"] Apr 22 20:47:57.582794 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.582756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-proc\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.582794 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.582790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-podres\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.583046 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.582821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-sys\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.583046 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.582893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j6n5\" (UniqueName: \"kubernetes.io/projected/34799a3b-aa2d-4e9d-aac7-49cc31780f97-kube-api-access-9j6n5\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " 
pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.583046 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.583011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-lib-modules\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.683624 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.683585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-lib-modules\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.684414 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.684390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-proc\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.684653 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.684637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-podres\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.684891 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.684872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-sys\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.685130 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.685113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j6n5\" (UniqueName: \"kubernetes.io/projected/34799a3b-aa2d-4e9d-aac7-49cc31780f97-kube-api-access-9j6n5\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.685579 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.684821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-podres\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.685579 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.684278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-lib-modules\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.685729 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.685056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-sys\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.685729 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.684586 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/34799a3b-aa2d-4e9d-aac7-49cc31780f97-proc\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.692731 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.692704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j6n5\" (UniqueName: \"kubernetes.io/projected/34799a3b-aa2d-4e9d-aac7-49cc31780f97-kube-api-access-9j6n5\") pod \"perf-node-gather-daemonset-zf85f\" (UID: \"34799a3b-aa2d-4e9d-aac7-49cc31780f97\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.808028 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.807998 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:57.948115 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:57.948083 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f"] Apr 22 20:47:57.950215 ip-10-0-128-160 kubenswrapper[2575]: W0422 20:47:57.950187 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod34799a3b_aa2d_4e9d_aac7_49cc31780f97.slice/crio-efe4069a04fd6e8dccfed93c8f996c9f18f58ec4ca797de2eb0f4c4f50cf4464 WatchSource:0}: Error finding container efe4069a04fd6e8dccfed93c8f996c9f18f58ec4ca797de2eb0f4c4f50cf4464: Status 404 returned error can't find the container with id efe4069a04fd6e8dccfed93c8f996c9f18f58ec4ca797de2eb0f4c4f50cf4464 Apr 22 20:47:58.040689 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.040668 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dvs95_ddc58e3d-3248-48a6-990e-23900cf6ae7f/dns/0.log" Apr 22 20:47:58.059849 ip-10-0-128-160 kubenswrapper[2575]: 
I0422 20:47:58.059832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dvs95_ddc58e3d-3248-48a6-990e-23900cf6ae7f/kube-rbac-proxy/0.log" Apr 22 20:47:58.123908 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.123892 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hrntb_7047d9ee-3de6-4d53-a8d2-f3e0ea19b774/dns-node-resolver/0.log" Apr 22 20:47:58.549594 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.549569 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-52sgh_a09841fa-cd66-4311-9794-2efee23e5727/node-ca/0.log" Apr 22 20:47:58.592062 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.592032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" event={"ID":"34799a3b-aa2d-4e9d-aac7-49cc31780f97","Type":"ContainerStarted","Data":"b30fbd78603bba0f35f4a8d2f5e103d9b99937407c19d9ad1fdeb79bc52f35d5"} Apr 22 20:47:58.592220 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.592071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" event={"ID":"34799a3b-aa2d-4e9d-aac7-49cc31780f97","Type":"ContainerStarted","Data":"efe4069a04fd6e8dccfed93c8f996c9f18f58ec4ca797de2eb0f4c4f50cf4464"} Apr 22 20:47:58.592220 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.592171 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:47:58.607310 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:58.607252 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" podStartSLOduration=1.607236208 podStartE2EDuration="1.607236208s" podCreationTimestamp="2026-04-22 20:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:47:58.605836769 +0000 UTC m=+2957.803157651" watchObservedRunningTime="2026-04-22 20:47:58.607236208 +0000 UTC m=+2957.804557091" Apr 22 20:47:59.253421 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:59.253384 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-c4cc68cf4-99gts_eb61edaa-0f4d-4c21-a6c8-60949c33b5d1/router/0.log" Apr 22 20:47:59.588174 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:59.588105 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lz5pr_cfae59ae-9e91-4cd6-b825-4209ded69c88/serve-healthcheck-canary/0.log" Apr 22 20:47:59.966550 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:59.966519 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dj9hh_b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf/kube-rbac-proxy/0.log" Apr 22 20:47:59.985701 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:47:59.985674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dj9hh_b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf/exporter/0.log" Apr 22 20:48:00.005740 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:00.005711 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dj9hh_b2af54cc-d1fd-48ec-b7ae-3c41f6c630cf/extractor/0.log" Apr 22 20:48:04.609793 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:04.609766 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-zf85f" Apr 22 20:48:07.306731 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.306702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/kube-multus-additional-cni-plugins/0.log" Apr 22 
20:48:07.327755 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.327726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/egress-router-binary-copy/0.log" Apr 22 20:48:07.347504 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.347482 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/cni-plugins/0.log" Apr 22 20:48:07.365631 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.365614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/bond-cni-plugin/0.log" Apr 22 20:48:07.385859 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.385840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/routeoverride-cni/0.log" Apr 22 20:48:07.403337 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.403321 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/whereabouts-cni-bincopy/0.log" Apr 22 20:48:07.421212 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.421198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vn8pw_bd664ecc-6372-4249-9d3f-4ca9da5c429c/whereabouts-cni/0.log" Apr 22 20:48:07.445837 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.445818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v7xx6_ec02ec42-2641-4c07-aa33-0277c20a77a7/kube-multus/0.log" Apr 22 20:48:07.569719 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.569667 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-rw25k_7c3e7957-917d-44e3-8833-76ccc4a5d167/network-metrics-daemon/0.log" Apr 22 20:48:07.587154 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:07.587135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rw25k_7c3e7957-917d-44e3-8833-76ccc4a5d167/kube-rbac-proxy/0.log" Apr 22 20:48:08.546081 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.546011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-controller/0.log" Apr 22 20:48:08.565622 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.565597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/0.log" Apr 22 20:48:08.581781 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.581758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovn-acl-logging/1.log" Apr 22 20:48:08.601025 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.601002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/kube-rbac-proxy-node/0.log" Apr 22 20:48:08.622980 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.622955 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:48:08.640288 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.640273 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/northd/0.log" Apr 22 20:48:08.659549 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.659525 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/nbdb/0.log" Apr 22 20:48:08.680987 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.680964 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/sbdb/0.log" Apr 22 20:48:08.806983 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:08.806891 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jw84m_8bc4cd36-7e91-452b-b1f8-8c4ebd9f39ed/ovnkube-controller/0.log" Apr 22 20:48:09.961149 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:09.961119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9tw8t_dd53ea43-190e-42c7-b4f7-20127893755e/network-check-target-container/0.log" Apr 22 20:48:10.788309 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:10.788279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5sz4q_c3fdb762-05cf-4828-8c3e-45c50aca2528/iptables-alerter/0.log" Apr 22 20:48:11.397569 ip-10-0-128-160 kubenswrapper[2575]: I0422 20:48:11.397545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8wmlw_0155db59-b48d-4919-97a6-8855d675375d/tuned/0.log"