Apr 22 18:43:56.390364 ip-10-0-129-249 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:43:56.390378 ip-10-0-129-249 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:43:56.390386 ip-10-0-129-249 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:43:56.390638 ip-10-0-129-249 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:44:06.492090 ip-10-0-129-249 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:44:06.492107 ip-10-0-129-249 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 751cd99d2a6b45fabdc84898163f1f30 --
Apr 22 18:46:36.160051 ip-10-0-129-249 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:36.595439 ip-10-0-129-249 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:36.595439 ip-10-0-129-249 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:36.595439 ip-10-0-129-249 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:36.595439 ip-10-0-129-249 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:36.595439 ip-10-0-129-249 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
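The first boot above fails before the kubelet ever runs: systemd cannot load the unit's environment files, and it cannot schedule a restart because crio.service is not present, so the unit finishes with result 'resources'. A minimal diagnostic sketch for that state, assuming only a systemd host with journalctl (the unit names are taken from the log; no specific file paths are implied):

    # Show which EnvironmentFile= paths kubelet.service expects, then verify those files exist
    systemctl cat kubelet.service | grep -i 'EnvironmentFile'
    # Confirm whether crio.service is installed at all on this node
    systemctl list-unit-files 'crio.service'
    # Review the kubelet messages from the previous (failed) boot
    journalctl -u kubelet.service -b -1 --no-pager | tail -n 20

In the second boot that follows, the kubelet does start and logs its command-line deprecation warnings and feature-gate output.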
Apr 22 18:46:36.597143 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.597015 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 22 18:46:36.603549 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603529 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:36.603549 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603546 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:36.603549 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603551 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:36.603549 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603554 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:36.603549 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603557 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:36.603549 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603559 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603562 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603565 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603569 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603571 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603574 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603577 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603580 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603582 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603585 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603587 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603590 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603592 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603595 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603598 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 
18:46:36.603600 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603603 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603605 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603608 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603611 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:36.603757 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603613 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603616 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603626 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603628 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603632 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603635 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603637 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603641 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603643 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603646 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603648 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603651 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603653 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603656 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603659 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603661 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603664 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603667 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:36.604266 ip-10-0-129-249 
kubenswrapper[2581]: W0422 18:46:36.603669 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603672 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:36.604266 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603674 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603677 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603680 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603682 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603685 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603687 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603690 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603693 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603695 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603698 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603700 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603704 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603708 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603711 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603715 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603718 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603721 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603724 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603727 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:36.604851 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603730 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603732 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603735 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603737 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603740 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603742 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603745 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603749 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603751 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603754 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603757 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603759 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603762 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603765 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603769 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603771 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603774 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603776 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603779 2581 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603781 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:36.605315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603800 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.603803 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605603 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605611 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605614 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605618 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605621 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605624 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605628 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605630 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605633 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605636 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605638 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605641 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605644 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605646 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605649 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605651 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605654 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605657 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:36.605811 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605659 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 
22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605661 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605664 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605666 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605669 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605671 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605674 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605676 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605679 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605681 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605684 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605686 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605688 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605691 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605694 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605696 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605699 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605701 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605704 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605706 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:36.606293 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605709 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605711 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605713 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605717 2581 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605720 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605723 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605726 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605728 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605731 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605734 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605736 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605739 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605741 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605744 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605746 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605749 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605751 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605753 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605755 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605758 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:36.606795 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605761 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605763 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605765 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605768 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605771 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605774 2581 feature_gate.go:328] unrecognized feature gate: NewOLM 
Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605777 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605779 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605782 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605802 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605805 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605808 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605811 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605813 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605816 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605819 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605822 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605825 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605830 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:36.607283 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605834 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605837 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605840 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605843 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605845 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605848 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605850 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605853 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.605855 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605942 2581 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605949 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605956 2581 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605960 2581 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605965 2581 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605968 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605972 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605977 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605980 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605983 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605987 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605990 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605994 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 18:46:36.607751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.605997 2581 flags.go:64] FLAG: --cgroup-root="" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606000 2581 flags.go:64] FLAG: 
--cgroups-per-qos="true" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606003 2581 flags.go:64] FLAG: --client-ca-file="" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606005 2581 flags.go:64] FLAG: --cloud-config="" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606008 2581 flags.go:64] FLAG: --cloud-provider="external" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606011 2581 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606015 2581 flags.go:64] FLAG: --cluster-domain="" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606018 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606021 2581 flags.go:64] FLAG: --config-dir="" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606024 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606027 2581 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606031 2581 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606034 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606037 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606040 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606043 2581 flags.go:64] FLAG: --contention-profiling="false" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606046 2581 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606049 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606053 2581 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606055 2581 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606059 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606062 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606065 2581 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606068 2581 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606072 2581 flags.go:64] FLAG: --enable-server="true" Apr 22 18:46:36.608319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606075 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606079 2581 flags.go:64] FLAG: --event-burst="100" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606082 2581 flags.go:64] 
FLAG: --event-qps="50" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606085 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606089 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606092 2581 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606095 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606098 2581 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606101 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606104 2581 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606107 2581 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606109 2581 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606112 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606123 2581 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606126 2581 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606129 2581 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606132 2581 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606136 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606139 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606142 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606145 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606148 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606152 2581 flags.go:64] FLAG: --help="false" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606154 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-129-249.ec2.internal" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606157 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:36.608927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606160 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606163 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606166 2581 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606170 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606172 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606175 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606178 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606181 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606184 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606187 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606190 2581 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606193 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606196 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606199 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606202 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606205 2581 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606207 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606210 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606213 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606218 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606221 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606224 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606226 2581 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:36.609562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606229 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606233 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606236 2581 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606238 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606243 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:36.610125 
ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606245 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606249 2581 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606252 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606255 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606258 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606260 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606263 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606266 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606269 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606276 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606280 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606283 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606286 2581 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606289 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606295 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606298 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606302 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606305 2581 flags.go:64] FLAG: --port="10250" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606308 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:36.610125 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606311 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0242d323d5c1ca6d6" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606314 2581 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606317 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606320 2581 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606322 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606325 2581 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: 
I0422 18:46:36.606329 2581 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606332 2581 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606335 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606337 2581 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606341 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606344 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606347 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606349 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606352 2581 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606355 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606358 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606361 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606363 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606378 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606381 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606385 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606388 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606391 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606394 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606397 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:36.610700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606400 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606403 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606406 2581 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606409 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606414 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606417 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: 
I0422 18:46:36.606420 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606424 2581 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606426 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606429 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606432 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606435 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606437 2581 flags.go:64] FLAG: --v="2" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606441 2581 flags.go:64] FLAG: --version="false" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606448 2581 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606452 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.606456 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606543 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606546 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606550 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606552 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606555 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606558 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606561 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:36.611362 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606564 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606566 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606569 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606572 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606574 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606577 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606579 2581 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606582 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606584 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606591 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606593 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606596 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606598 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606601 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606603 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606606 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606608 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606611 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606613 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606616 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:36.611953 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606618 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606621 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606625 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606627 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606630 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606633 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606636 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606638 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606641 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 
18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606643 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606646 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606649 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606651 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606654 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606656 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606659 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606661 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606664 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606666 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606669 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:36.612474 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606671 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606675 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606678 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606681 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606683 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606686 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606688 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606691 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606693 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606696 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606698 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: 
W0422 18:46:36.606700 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606703 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606705 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606709 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606712 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606714 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606717 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606719 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:36.613044 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606723 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606727 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606730 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606732 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606734 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606737 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606739 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606742 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606744 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606746 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606749 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606751 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606754 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606756 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606760 2581 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606762 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606765 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606769 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606772 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:36.613584 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.606775 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:36.614130 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.607700 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:36.614972 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.614955 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:46:36.615011 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.614973 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615018 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615024 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615027 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615031 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615034 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615038 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615040 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:36.615040 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615043 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615047 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615050 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615053 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:36.615271 ip-10-0-129-249 
kubenswrapper[2581]: W0422 18:46:36.615055 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615060 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615064 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615067 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615069 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615072 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615074 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615077 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615080 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615083 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615085 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615087 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615090 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615093 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615095 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615098 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:36.615271 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615100 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615102 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615105 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615108 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615111 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615113 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:36.615900 
ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615116 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615118 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615121 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615123 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615125 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615128 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615130 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615133 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615136 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615139 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615142 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615144 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615147 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:36.615900 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615149 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615152 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615155 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615157 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615160 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615163 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615165 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615168 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615170 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615172 2581 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615175 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615177 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615180 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615182 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615184 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615187 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615189 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615192 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615195 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:36.616455 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615197 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615201 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615205 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615208 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615211 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615213 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615216 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615218 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615222 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615224 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615227 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615230 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615232 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615235 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615237 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615240 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615242 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615245 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615247 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:36.616983 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615250 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615252 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.615257 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615370 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615375 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615378 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615381 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615383 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615386 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615389 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615391 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615394 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615396 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615400 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615402 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615405 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:36.617466 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615407 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615409 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615412 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615415 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615417 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615420 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615423 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615426 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615428 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: 
W0422 18:46:36.615431 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615433 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615436 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615438 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615440 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615443 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615445 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615448 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615450 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615452 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615455 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:36.617885 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615457 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615460 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615462 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615465 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615467 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615470 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615472 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615475 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615477 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615480 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615482 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615485 2581 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615487 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615489 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615492 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615494 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615497 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615499 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615503 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:36.618386 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615507 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615510 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615512 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615515 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615517 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615520 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615522 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615525 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615527 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615530 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615532 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615535 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615537 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615540 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615542 2581 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615545 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615547 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615550 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615552 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615555 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:36.618866 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615557 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615560 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615562 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615565 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615567 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615570 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615572 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615575 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615577 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615579 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615583 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615587 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615589 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:36.615592 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.615596 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:36.619343 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.616348 2581 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:46:36.619914 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.619901 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:46:36.620881 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.620869 2581 server.go:1019] "Starting client certificate rotation" Apr 22 18:46:36.620983 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.620967 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:36.621012 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.621006 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:36.646912 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.646895 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:36.654027 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.654005 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:36.669211 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.669190 2581 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:46:36.673658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.673640 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:36.674843 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.674824 2581 log.go:25] "Validated CRI v1 image API" Apr 22 18:46:36.676197 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.676185 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:46:36.680780 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.680759 2581 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7ac1503e-dc74-4f7d-9a0e-11ac4c6dd69b:/dev/nvme0n1p3 a3174170-ed4a-4f24-b20c-1bdfd9f9dab3:/dev/nvme0n1p4] Apr 22 18:46:36.680839 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.680781 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:46:36.686368 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.686270 2581 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:36.684378642 +0000 UTC m=+0.407138189 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100178 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b5006f69b83caf2d75e406f83fd44 SystemUUID:ec2b5006-f69b-83ca-f2d7-5e406f83fd44 BootID:751cd99d-2a6b-45fa-bdc8-4898163f1f30 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3c:bd:67:d7:a3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3c:bd:67:d7:a3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:6c:e2:5d:06:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:46:36.686368 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.686365 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:46:36.686459 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.686438 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:46:36.687474 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.687455 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:46:36.687620 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.687477 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-249.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:36.687663 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.687629 2581 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:36.687663 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.687653 2581 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:36.687719 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.687666 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:36.688384 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.688373 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:36.689726 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.689716 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:36.689861 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.689852 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:36.692481 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.692470 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:36.692514 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.692490 2581 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 18:46:36.692514 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.692501 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:36.692514 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.692509 2581 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:36.692602 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.692517 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:46:36.693652 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.693638 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:36.693698 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.693667 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:36.695422 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.695400 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5rfxz" Apr 22 18:46:36.696742 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.696727 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:36.698164 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.698150 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:36.700251 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700237 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:36.700251 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700253 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700267 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700272 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700279 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700284 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700289 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700296 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700303 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700308 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700321 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:36.700358 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.700330 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:36.701147 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:46:36.701131 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:36.701204 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.701148 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:36.701204 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.701139 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5rfxz" Apr 22 18:46:36.706541 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.706331 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:36.706670 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.706570 2581 server.go:1295] "Started kubelet" Apr 22 18:46:36.706728 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.706658 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:36.706781 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.706697 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:36.706781 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.706759 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:36.707376 ip-10-0-129-249 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:46:36.708751 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.708568 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:36.708972 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.708958 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:36.710108 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.710093 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:36.712582 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.712456 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:36.714456 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.714437 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:36.714456 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.714444 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:36.715099 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715078 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:36.715206 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715099 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:36.715265 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715245 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:36.715265 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.715251 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-249.ec2.internal\" not found" Apr 22 18:46:36.715367 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715316 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:36.715367 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715324 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:36.715367 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715334 2581 factory.go:55] Registering systemd factory Apr 22 18:46:36.715367 ip-10-0-129-249 kubenswrapper[2581]: I0422 
18:46:36.715347 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 22 18:46:36.715684 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715667 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-249.ec2.internal" not found
Apr 22 18:46:36.715750 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715743 2581 factory.go:153] Registering CRI-O factory
Apr 22 18:46:36.715827 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715757 2581 factory.go:223] Registration of the crio container factory successfully
Apr 22 18:46:36.715886 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715847 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 18:46:36.715886 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715872 2581 factory.go:103] Registering Raw factory
Apr 22 18:46:36.715981 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.715888 2581 manager.go:1196] Started watching for new ooms in manager
Apr 22 18:46:36.716482 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.716425 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 18:46:36.716771 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.716755 2581 manager.go:319] Starting recovery of all containers
Apr 22 18:46:36.717621 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.717596 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:36.720740 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.720719 2581 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-249.ec2.internal\" not found" node="ip-10-0-129-249.ec2.internal"
Apr 22 18:46:36.727330 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.727308 2581 manager.go:324] Recovery completed
Apr 22 18:46:36.732420 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.732273 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-249.ec2.internal" not found
Apr 22 18:46:36.732542 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.732531 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:36.734284 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.734269 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:36.734348 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.734299 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:36.734348 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.734310 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:36.734726 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.734707 2581 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 18:46:36.734726 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.734723 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 18:46:36.734816 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.734738 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:46:36.737339 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.737327 2581 policy_none.go:49] "None policy: Start"
Apr 22 18:46:36.737374 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.737343 2581 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 18:46:36.737374 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.737353 2581 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 18:46:36.773306 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773291 2581 manager.go:341] "Starting Device Plugin manager"
Apr 22 18:46:36.773384 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.773323 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 18:46:36.773384 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773336 2581 server.go:85] "Starting device plugin registration server"
Apr 22 18:46:36.773564 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773551 2581 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 18:46:36.773618 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773565 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 18:46:36.773672 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773653 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 18:46:36.773750 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773735 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 18:46:36.773750 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.773748 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 18:46:36.774193 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.774176 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 18:46:36.774280 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.774207 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-249.ec2.internal\" not found"
Apr 22 18:46:36.794214 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.794200 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-249.ec2.internal" not found
Apr 22 18:46:36.856148 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.856085 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 18:46:36.857466 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.857448 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 18:46:36.857546 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.857483 2581 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 18:46:36.857546 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.857503 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 18:46:36.857546 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.857514 2581 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 18:46:36.857680 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:36.857553 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 18:46:36.860175 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.860155 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:36.874159 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.874145 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:36.874860 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.874845 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:36.874944 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.874876 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:36.874944 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.874890 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:36.874944 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.874920 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-249.ec2.internal"
Apr 22 18:46:36.884045 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.884031 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-249.ec2.internal"
Apr 22 18:46:36.958512 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.958490 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal"]
Apr 22 18:46:36.960847 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.960831 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal"
Apr 22 18:46:36.960925 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.960900 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:36.988434 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.988417 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" Apr 22 18:46:36.991854 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:36.991842 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.002823 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.002804 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:37.004696 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.004681 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:37.016591 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.016569 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/894b23123d23c1b0610c463e2e9162b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal\" (UID: \"894b23123d23c1b0610c463e2e9162b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.116767 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.116708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/894b23123d23c1b0610c463e2e9162b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal\" (UID: \"894b23123d23c1b0610c463e2e9162b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.116767 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.116743 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0aa354c0cbfbaf0036d5f596d6e0335c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-249.ec2.internal\" (UID: \"0aa354c0cbfbaf0036d5f596d6e0335c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.116891 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.116782 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/894b23123d23c1b0610c463e2e9162b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal\" (UID: \"894b23123d23c1b0610c463e2e9162b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.116891 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.116829 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/894b23123d23c1b0610c463e2e9162b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal\" (UID: \"894b23123d23c1b0610c463e2e9162b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.217402 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.217379 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/894b23123d23c1b0610c463e2e9162b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal\" (UID: \"894b23123d23c1b0610c463e2e9162b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.217473 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.217405 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0aa354c0cbfbaf0036d5f596d6e0335c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-249.ec2.internal\" (UID: \"0aa354c0cbfbaf0036d5f596d6e0335c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.217473 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.217441 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/894b23123d23c1b0610c463e2e9162b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal\" (UID: \"894b23123d23c1b0610c463e2e9162b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.217545 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.217485 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0aa354c0cbfbaf0036d5f596d6e0335c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-249.ec2.internal\" (UID: \"0aa354c0cbfbaf0036d5f596d6e0335c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.306370 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.306349 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.307408 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.307393 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" Apr 22 18:46:37.620990 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.620963 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:37.621501 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.621095 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:37.621501 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.621118 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:37.621501 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.621120 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:37.693033 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.693010 2581 apiserver.go:52] "Watching apiserver" Apr 22 18:46:37.701390 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.701375 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:37.703299 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.703272 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:36 +0000 UTC" deadline="2027-09-29 21:16:40.910831544 +0000 UTC" Apr 22 18:46:37.703345 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.703298 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12602h30m3.207535437s" Apr 22 18:46:37.703442 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.703423 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-b4wkh","openshift-ovn-kubernetes/ovnkube-node-b9cf6","kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k","openshift-cluster-node-tuning-operator/tuned-26q7g","openshift-image-registry/node-ca-grdm6","openshift-multus/network-metrics-daemon-z9kwg","openshift-network-operator/iptables-alerter-qh2jx","kube-system/konnectivity-agent-dngnr","openshift-dns/node-resolver-sp2sj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal","openshift-multus/multus-additional-cni-plugins-dhp4b","openshift-multus/multus-dvr84"] Apr 22 18:46:37.705809 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.705777 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:37.705904 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.705867 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:37.708216 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.708199 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.710409 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.710387 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.711108 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.711089 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.711238 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.711223 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:37.711295 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.711276 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:37.711386 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.711368 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:37.713008 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.712114 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.713008 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.712215 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:37.713008 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.712913 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sbgr7\"" Apr 22 18:46:37.713191 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.713131 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:37.713191 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.713180 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pd4b9\"" Apr 22 18:46:37.713398 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.713380 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.713618 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.713597 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.714603 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.714587 2581 
certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:37.715235 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.715217 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.715414 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.715361 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.717428 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.717412 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fvffw\"" Apr 22 18:46:37.717729 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.717710 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.717834 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.717728 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.717894 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.717833 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zv54v\"" Apr 22 18:46:37.717894 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.717839 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:37.718033 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.718019 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.718646 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.718629 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.719877 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.719861 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:37.719964 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.719921 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:37.719964 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.719963 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.721640 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721619 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-cni-bin\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.721733 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721658 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.721733 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721684 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovnkube-config\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.721733 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovnkube-script-lib\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721732 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-sys\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721760 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-env-overrides\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721808 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-device-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721831 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysconfig\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.721943 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:46:37.721849 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-systemd\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721885 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96308ab5-cbfb-459e-9e75-9d548626286b-host\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721918 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7qvp\" (UniqueName: \"kubernetes.io/projected/47eec246-c244-4918-8600-48de7568588b-kube-api-access-f7qvp\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:37.721943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721944 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-run-netns\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.721974 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-var-lib-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722042 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-log-socket\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722077 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovn-node-metrics-cert\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722113 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-sys-fs\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722137 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-kubernetes\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722166 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-lib-modules\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722191 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-var-lib-kubelet\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722214 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722259 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-cni-netd\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722299 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6cpk\" (UniqueName: \"kubernetes.io/projected/b77cfc29-6e5e-47f5-b607-aa33e5a172af-kube-api-access-v6cpk\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722338 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcsjs\" (UniqueName: \"kubernetes.io/projected/88342289-d6b0-4f23-a7a4-e1b94386e991-kube-api-access-fcsjs\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722374 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysctl-d\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722400 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-tuned\") pod \"tuned-26q7g\" (UID: 
\"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722422 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbz6p\" (UniqueName: \"kubernetes.io/projected/4f604274-eb1a-4b2d-865b-59dbe9dc8461-kube-api-access-hbz6p\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722450 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722473 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-slash\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722527 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-host\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722551 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96308ab5-cbfb-459e-9e75-9d548626286b-serviceca\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722575 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwbf9\" (UniqueName: \"kubernetes.io/projected/96308ab5-cbfb-459e-9e75-9d548626286b-kube-api-access-qwbf9\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722605 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722620 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-etc-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722931 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:46:37.722648 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-ovn\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722666 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-etc-selinux\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722680 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-modprobe-d\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722693 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-run\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f604274-eb1a-4b2d-865b-59dbe9dc8461-tmp\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-socket-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722743 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-registration-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.722931 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722834 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-kubelet\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722878 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-systemd-units\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722916 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-run-ovn-kubernetes\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722961 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.722988 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-systemd\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.723025 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-node-log\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.723050 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysctl-conf\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.723071 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.723502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.723074 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:37.723882 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.723698 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-g9xd4\"" Apr 22 18:46:37.723882 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.723698 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.725399 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.725369 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.725604 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.725582 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:37.725825 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.725812 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:37.725896 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.725886 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8lqh6\"" Apr 22 18:46:37.727456 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.727436 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.727701 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.727683 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.728214 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.728196 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lmkzb\"" Apr 22 18:46:37.728299 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.728202 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.729979 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.729963 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:37.730084 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730070 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:37.730145 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730126 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4jknq\"" Apr 22 18:46:37.730145 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730136 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.730750 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730733 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:37.730857 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730814 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:37.730857 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730839 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:37.730971 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.730862 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:37.732561 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.732546 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:37.732719 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.732706 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9whsv\"" Apr 22 18:46:37.752111 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.752094 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-d2rz2" Apr 22 18:46:37.760170 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.760154 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-d2rz2" Apr 22 18:46:37.816105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.816087 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:37.823736 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-cni-multus\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.823829 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823751 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-kubelet\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.823829 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823809 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-registration-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823840 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-kubelet\") pod \"ovnkube-node-b9cf6\" (UID: 
\"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823844 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-registration-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823864 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-systemd-units\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823884 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-run-ovn-kubernetes\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823891 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-kubelet\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823907 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-conf-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.823927 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823920 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-systemd-units\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-systemd\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823957 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysctl-conf\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823977 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-run-ovn-kubernetes\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.823982 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824035 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-system-cni-dir\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824064 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-cnibin\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824066 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysctl-conf\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.824091 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824103 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824159 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-systemd\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.824172 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.324130619 +0000 UTC m=+2.046890177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:37.824196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-sys\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824209 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824242 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-sys\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824216 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3453260f-3618-4209-b141-058bfe076e0c-tmp-dir\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824285 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824313 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/add2dc4a-bd5c-417c-91dd-132eb3de7087-cni-binary-copy\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824341 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-env-overrides\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824365 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-systemd\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " 
pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824387 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96308ab5-cbfb-459e-9e75-9d548626286b-host\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824415 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdwm\" (UniqueName: \"kubernetes.io/projected/697a8ed9-86fe-434b-9bc5-3296e657ff3d-kube-api-access-2cdwm\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824430 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-systemd\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824441 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b456252b-c126-48fd-ba56-9b92b64d07ce-konnectivity-ca\") pod \"konnectivity-agent-dngnr\" (UID: \"b456252b-c126-48fd-ba56-9b92b64d07ce\") " pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824469 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-var-lib-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824474 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96308ab5-cbfb-459e-9e75-9d548626286b-host\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824508 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-kubernetes\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824519 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-var-lib-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824534 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-openvswitch\") 
pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.824707 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824557 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6cpk\" (UniqueName: \"kubernetes.io/projected/b77cfc29-6e5e-47f5-b607-aa33e5a172af-kube-api-access-v6cpk\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824569 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-kubernetes\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824580 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcsjs\" (UniqueName: \"kubernetes.io/projected/88342289-d6b0-4f23-a7a4-e1b94386e991-kube-api-access-fcsjs\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824602 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysctl-d\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824609 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824625 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbz6p\" (UniqueName: \"kubernetes.io/projected/4f604274-eb1a-4b2d-865b-59dbe9dc8461-kube-api-access-hbz6p\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824652 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-cnibin\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-socket-dir-parent\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824704 2581 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-host\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824721 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysctl-d\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824730 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824809 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-host\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824860 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-env-overrides\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824961 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-etc-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.824999 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-etc-selinux\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825025 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-modprobe-d\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825072 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-etc-openvswitch\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.825489 ip-10-0-129-249 kubenswrapper[2581]: I0422 
18:46:37.825119 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-modprobe-d\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825163 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-etc-selinux\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825198 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-cni-binary-copy\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825222 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825246 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpg7f\" (UniqueName: \"kubernetes.io/projected/63aaeb86-a51a-4444-93df-19041d851cd6-kube-api-access-fpg7f\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825272 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-system-cni-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825301 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-socket-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825327 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825353 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-os-release\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-cni-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825401 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-hostroot\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825430 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-node-log\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825455 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b456252b-c126-48fd-ba56-9b92b64d07ce-agent-certs\") pod \"konnectivity-agent-dngnr\" (UID: \"b456252b-c126-48fd-ba56-9b92b64d07ce\") " pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825477 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825482 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-cni-bin\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825480 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-socket-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825516 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovnkube-config\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.826423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825533 
2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovnkube-script-lib\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825533 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-cni-bin\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825550 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4xz\" (UniqueName: \"kubernetes.io/projected/3453260f-3618-4209-b141-058bfe076e0c-kube-api-access-vb4xz\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825551 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-node-log\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825654 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/697a8ed9-86fe-434b-9bc5-3296e657ff3d-iptables-alerter-script\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825695 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-daemon-config\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825711 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-device-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825732 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysconfig\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825757 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qvp\" (UniqueName: \"kubernetes.io/projected/47eec246-c244-4918-8600-48de7568588b-kube-api-access-f7qvp\") pod \"network-metrics-daemon-z9kwg\" (UID: 
\"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825783 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-cni-bin\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-sysconfig\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825862 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-multus-certs\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825906 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-tuned\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825980 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-run-netns\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.825990 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-device-dir\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-log-socket\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826029 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovnkube-config\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826034 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-run-netns\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826072 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovn-node-metrics-cert\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826089 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-log-socket\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826102 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovnkube-script-lib\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826131 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-sys-fs\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826233 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-lib-modules\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826261 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88342289-d6b0-4f23-a7a4-e1b94386e991-sys-fs\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-var-lib-kubelet\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826313 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-k8s-cni-cncf-io\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826346 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-var-lib-kubelet\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826363 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-cni-netd\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826389 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-os-release\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826290 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826405 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-lib-modules\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqn9\" (UniqueName: \"kubernetes.io/projected/add2dc4a-bd5c-417c-91dd-132eb3de7087-kube-api-access-twqn9\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826423 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-cni-netd\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826463 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826507 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-slash\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.827658 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826535 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96308ab5-cbfb-459e-9e75-9d548626286b-serviceca\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwbf9\" (UniqueName: \"kubernetes.io/projected/96308ab5-cbfb-459e-9e75-9d548626286b-kube-api-access-qwbf9\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826605 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-host-slash\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826642 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3453260f-3618-4209-b141-058bfe076e0c-hosts-file\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/697a8ed9-86fe-434b-9bc5-3296e657ff3d-host-slash\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826701 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-netns\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826724 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-etc-kubernetes\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826765 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-ovn\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826835 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-run\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826861 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f604274-eb1a-4b2d-865b-59dbe9dc8461-tmp\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826899 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f604274-eb1a-4b2d-865b-59dbe9dc8461-run\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826861 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b77cfc29-6e5e-47f5-b607-aa33e5a172af-run-ovn\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.828228 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.826977 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96308ab5-cbfb-459e-9e75-9d548626286b-serviceca\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.829258 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.829234 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f604274-eb1a-4b2d-865b-59dbe9dc8461-etc-tuned\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.829348 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.829290 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f604274-eb1a-4b2d-865b-59dbe9dc8461-tmp\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.829458 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.829440 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b77cfc29-6e5e-47f5-b607-aa33e5a172af-ovn-node-metrics-cert\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.834205 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.834189 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:37.834274 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.834208 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:37.834274 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:37.834217 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:37.834274 ip-10-0-129-249 kubenswrapper[2581]: 
E0422 18:46:37.834255 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.334244286 +0000 UTC m=+2.057003820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:37.836901 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:37.836878 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa354c0cbfbaf0036d5f596d6e0335c.slice/crio-bf6bb00eec21a71cd74e76a2d5a6047b8b734b1687edacc07f523b9a5c891bb0 WatchSource:0}: Error finding container bf6bb00eec21a71cd74e76a2d5a6047b8b734b1687edacc07f523b9a5c891bb0: Status 404 returned error can't find the container with id bf6bb00eec21a71cd74e76a2d5a6047b8b734b1687edacc07f523b9a5c891bb0 Apr 22 18:46:37.837077 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:37.837062 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod894b23123d23c1b0610c463e2e9162b9.slice/crio-a9cf5f80c59f3bd4eb7ba274045754c5c79afd4a560043682c491d89c6a4ef79 WatchSource:0}: Error finding container a9cf5f80c59f3bd4eb7ba274045754c5c79afd4a560043682c491d89c6a4ef79: Status 404 returned error can't find the container with id a9cf5f80c59f3bd4eb7ba274045754c5c79afd4a560043682c491d89c6a4ef79 Apr 22 18:46:37.840429 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.840404 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwbf9\" (UniqueName: \"kubernetes.io/projected/96308ab5-cbfb-459e-9e75-9d548626286b-kube-api-access-qwbf9\") pod \"node-ca-grdm6\" (UID: \"96308ab5-cbfb-459e-9e75-9d548626286b\") " pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:37.841307 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.841286 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcsjs\" (UniqueName: \"kubernetes.io/projected/88342289-d6b0-4f23-a7a4-e1b94386e991-kube-api-access-fcsjs\") pod \"aws-ebs-csi-driver-node-wdp8k\" (UID: \"88342289-d6b0-4f23-a7a4-e1b94386e991\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:37.841448 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.841431 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6cpk\" (UniqueName: \"kubernetes.io/projected/b77cfc29-6e5e-47f5-b607-aa33e5a172af-kube-api-access-v6cpk\") pod \"ovnkube-node-b9cf6\" (UID: \"b77cfc29-6e5e-47f5-b607-aa33e5a172af\") " pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:37.842107 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.842083 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbz6p\" (UniqueName: \"kubernetes.io/projected/4f604274-eb1a-4b2d-865b-59dbe9dc8461-kube-api-access-hbz6p\") pod \"tuned-26q7g\" (UID: \"4f604274-eb1a-4b2d-865b-59dbe9dc8461\") " pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:37.842727 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:46:37.842662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qvp\" (UniqueName: \"kubernetes.io/projected/47eec246-c244-4918-8600-48de7568588b-kube-api-access-f7qvp\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:37.843027 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.843005 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:46:37.860025 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.859990 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" event={"ID":"894b23123d23c1b0610c463e2e9162b9","Type":"ContainerStarted","Data":"a9cf5f80c59f3bd4eb7ba274045754c5c79afd4a560043682c491d89c6a4ef79"} Apr 22 18:46:37.860909 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.860891 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" event={"ID":"0aa354c0cbfbaf0036d5f596d6e0335c","Type":"ContainerStarted","Data":"bf6bb00eec21a71cd74e76a2d5a6047b8b734b1687edacc07f523b9a5c891bb0"} Apr 22 18:46:37.927761 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927708 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-multus-certs\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927761 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927738 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-k8s-cni-cncf-io\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927761 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927755 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-os-release\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twqn9\" (UniqueName: \"kubernetes.io/projected/add2dc4a-bd5c-417c-91dd-132eb3de7087-kube-api-access-twqn9\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927827 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3453260f-3618-4209-b141-058bfe076e0c-hosts-file\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927834 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-k8s-cni-cncf-io\") pod 
\"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927850 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/697a8ed9-86fe-434b-9bc5-3296e657ff3d-host-slash\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927841 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-multus-certs\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927860 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-os-release\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-netns\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927895 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/697a8ed9-86fe-434b-9bc5-3296e657ff3d-host-slash\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.927917 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927903 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3453260f-3618-4209-b141-058bfe076e0c-hosts-file\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-run-netns\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927947 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-etc-kubernetes\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927964 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-cni-multus\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " 
pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-kubelet\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.927996 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-conf-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928002 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-cni-multus\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928007 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-etc-kubernetes\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928017 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-system-cni-dir\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928047 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-system-cni-dir\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928048 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-kubelet\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928065 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-conf-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928064 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-cnibin\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " 
pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928104 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-cnibin\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928101 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3453260f-3618-4209-b141-058bfe076e0c-tmp-dir\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928154 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928182 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/add2dc4a-bd5c-417c-91dd-132eb3de7087-cni-binary-copy\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.928323 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928209 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdwm\" (UniqueName: \"kubernetes.io/projected/697a8ed9-86fe-434b-9bc5-3296e657ff3d-kube-api-access-2cdwm\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928234 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b456252b-c126-48fd-ba56-9b92b64d07ce-konnectivity-ca\") pod \"konnectivity-agent-dngnr\" (UID: \"b456252b-c126-48fd-ba56-9b92b64d07ce\") " pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928354 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-cnibin\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928368 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3453260f-3618-4209-b141-058bfe076e0c-tmp-dir\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-socket-dir-parent\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928414 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928428 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-socket-dir-parent\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928459 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-cni-binary-copy\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928464 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-cnibin\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928484 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpg7f\" (UniqueName: \"kubernetes.io/projected/63aaeb86-a51a-4444-93df-19041d851cd6-kube-api-access-fpg7f\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928536 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-system-cni-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928563 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-os-release\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929085 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:46:37.928583 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-cni-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928603 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-hostroot\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928626 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b456252b-c126-48fd-ba56-9b92b64d07ce-agent-certs\") pod \"konnectivity-agent-dngnr\" (UID: \"b456252b-c126-48fd-ba56-9b92b64d07ce\") " pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928640 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929085 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928653 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4xz\" (UniqueName: \"kubernetes.io/projected/3453260f-3618-4209-b141-058bfe076e0c-kube-api-access-vb4xz\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928676 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/697a8ed9-86fe-434b-9bc5-3296e657ff3d-iptables-alerter-script\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928703 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-daemon-config\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928711 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63aaeb86-a51a-4444-93df-19041d851cd6-os-release\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928731 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-cni-bin\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " 
pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928769 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/add2dc4a-bd5c-417c-91dd-132eb3de7087-cni-binary-copy\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-host-var-lib-cni-bin\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928828 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928821 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b456252b-c126-48fd-ba56-9b92b64d07ce-konnectivity-ca\") pod \"konnectivity-agent-dngnr\" (UID: \"b456252b-c126-48fd-ba56-9b92b64d07ce\") " pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928863 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-system-cni-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928886 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-hostroot\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928938 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.928965 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-cni-dir\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.929013 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63aaeb86-a51a-4444-93df-19041d851cd6-cni-binary-copy\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: 
\"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.929251 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/697a8ed9-86fe-434b-9bc5-3296e657ff3d-iptables-alerter-script\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.929647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.929322 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/add2dc4a-bd5c-417c-91dd-132eb3de7087-multus-daemon-config\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.930970 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.930954 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b456252b-c126-48fd-ba56-9b92b64d07ce-agent-certs\") pod \"konnectivity-agent-dngnr\" (UID: \"b456252b-c126-48fd-ba56-9b92b64d07ce\") " pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:37.936767 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.936749 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqn9\" (UniqueName: \"kubernetes.io/projected/add2dc4a-bd5c-417c-91dd-132eb3de7087-kube-api-access-twqn9\") pod \"multus-dvr84\" (UID: \"add2dc4a-bd5c-417c-91dd-132eb3de7087\") " pod="openshift-multus/multus-dvr84" Apr 22 18:46:37.937103 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.937080 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdwm\" (UniqueName: \"kubernetes.io/projected/697a8ed9-86fe-434b-9bc5-3296e657ff3d-kube-api-access-2cdwm\") pod \"iptables-alerter-qh2jx\" (UID: \"697a8ed9-86fe-434b-9bc5-3296e657ff3d\") " pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:37.937962 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.937944 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4xz\" (UniqueName: \"kubernetes.io/projected/3453260f-3618-4209-b141-058bfe076e0c-kube-api-access-vb4xz\") pod \"node-resolver-sp2sj\" (UID: \"3453260f-3618-4209-b141-058bfe076e0c\") " pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:37.938238 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:37.938223 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpg7f\" (UniqueName: \"kubernetes.io/projected/63aaeb86-a51a-4444-93df-19041d851cd6-kube-api-access-fpg7f\") pod \"multus-additional-cni-plugins-dhp4b\" (UID: \"63aaeb86-a51a-4444-93df-19041d851cd6\") " pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:38.037249 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.037228 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:46:38.043166 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.043147 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77cfc29_6e5e_47f5_b607_aa33e5a172af.slice/crio-6f00f35c92aaf63af04e857065c307ac410707c7ad129fcf310cf717ec57a52e WatchSource:0}: Error finding container 6f00f35c92aaf63af04e857065c307ac410707c7ad129fcf310cf717ec57a52e: Status 404 returned error can't find the container with id 6f00f35c92aaf63af04e857065c307ac410707c7ad129fcf310cf717ec57a52e Apr 22 18:46:38.043903 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.043886 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" Apr 22 18:46:38.050281 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.050255 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88342289_d6b0_4f23_a7a4_e1b94386e991.slice/crio-827a2e0c426b14e3bcbf117013fa7dcbb72995805c4e6df1d93f4142c7a2a53f WatchSource:0}: Error finding container 827a2e0c426b14e3bcbf117013fa7dcbb72995805c4e6df1d93f4142c7a2a53f: Status 404 returned error can't find the container with id 827a2e0c426b14e3bcbf117013fa7dcbb72995805c4e6df1d93f4142c7a2a53f Apr 22 18:46:38.058261 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.058244 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-26q7g" Apr 22 18:46:38.063323 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.063301 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f604274_eb1a_4b2d_865b_59dbe9dc8461.slice/crio-f607ac7770e749af8d8ea9e714fd24acc332b59403c31ca5c937f2c8bf4f8116 WatchSource:0}: Error finding container f607ac7770e749af8d8ea9e714fd24acc332b59403c31ca5c937f2c8bf4f8116: Status 404 returned error can't find the container with id f607ac7770e749af8d8ea9e714fd24acc332b59403c31ca5c937f2c8bf4f8116 Apr 22 18:46:38.081180 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.081164 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grdm6" Apr 22 18:46:38.087917 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.087899 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96308ab5_cbfb_459e_9e75_9d548626286b.slice/crio-e0648af3411760e4f659d94a3e80841f9925f862c7cda7d0c3661b244c167292 WatchSource:0}: Error finding container e0648af3411760e4f659d94a3e80841f9925f862c7cda7d0c3661b244c167292: Status 404 returned error can't find the container with id e0648af3411760e4f659d94a3e80841f9925f862c7cda7d0c3661b244c167292 Apr 22 18:46:38.095346 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.095331 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qh2jx" Apr 22 18:46:38.101082 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.101063 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697a8ed9_86fe_434b_9bc5_3296e657ff3d.slice/crio-0c3612405ab4ebac13d88bcc9cfcf78a9974e461ff3eb7b3a21807deaf272f0f WatchSource:0}: Error finding container 0c3612405ab4ebac13d88bcc9cfcf78a9974e461ff3eb7b3a21807deaf272f0f: Status 404 returned error can't find the container with id 0c3612405ab4ebac13d88bcc9cfcf78a9974e461ff3eb7b3a21807deaf272f0f Apr 22 18:46:38.106040 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.106023 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:46:38.112315 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.112297 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb456252b_c126_48fd_ba56_9b92b64d07ce.slice/crio-b6c9b596297ee11fc780e459c7be130d329335a5e895a8927856634931ad9546 WatchSource:0}: Error finding container b6c9b596297ee11fc780e459c7be130d329335a5e895a8927856634931ad9546: Status 404 returned error can't find the container with id b6c9b596297ee11fc780e459c7be130d329335a5e895a8927856634931ad9546 Apr 22 18:46:38.118034 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.118020 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sp2sj" Apr 22 18:46:38.124212 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.124190 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3453260f_3618_4209_b141_058bfe076e0c.slice/crio-797f67798bec7fd15c329b2201f318cfb069b6c83ffb0d92ebeebe3ea4146e0c WatchSource:0}: Error finding container 797f67798bec7fd15c329b2201f318cfb069b6c83ffb0d92ebeebe3ea4146e0c: Status 404 returned error can't find the container with id 797f67798bec7fd15c329b2201f318cfb069b6c83ffb0d92ebeebe3ea4146e0c Apr 22 18:46:38.138480 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.138465 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" Apr 22 18:46:38.143438 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.143417 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63aaeb86_a51a_4444_93df_19041d851cd6.slice/crio-f2b93e4f24d55d18736c717fa8d47bbfea89ae0a5995903d1714bf4733cb49d2 WatchSource:0}: Error finding container f2b93e4f24d55d18736c717fa8d47bbfea89ae0a5995903d1714bf4733cb49d2: Status 404 returned error can't find the container with id f2b93e4f24d55d18736c717fa8d47bbfea89ae0a5995903d1714bf4733cb49d2 Apr 22 18:46:38.143929 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.143915 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dvr84" Apr 22 18:46:38.149135 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:46:38.149114 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadd2dc4a_bd5c_417c_91dd_132eb3de7087.slice/crio-538a01d491876acd070d6f68c4e5927332b928d656d6b5984b426b5a9f96a9e7 WatchSource:0}: Error finding container 538a01d491876acd070d6f68c4e5927332b928d656d6b5984b426b5a9f96a9e7: Status 404 returned error can't find the container with id 538a01d491876acd070d6f68c4e5927332b928d656d6b5984b426b5a9f96a9e7 Apr 22 18:46:38.331509 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.331437 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:38.331636 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:38.331547 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:38.331636 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:38.331598 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:39.331581091 +0000 UTC m=+3.054340625 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:38.432027 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.431996 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:38.432200 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:38.432183 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:38.432290 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:38.432207 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:38.432290 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:38.432219 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:38.432290 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:38.432272 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 
podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:39.432252754 +0000 UTC m=+3.155012290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:38.597809 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.597717 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:38.761501 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.761424 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:37 +0000 UTC" deadline="2027-11-14 16:15:39.514240656 +0000 UTC" Apr 22 18:46:38.761501 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.761457 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13701h29m0.752786936s" Apr 22 18:46:38.840841 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.840647 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:38.865602 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.865539 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:38.889337 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.889281 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"6f00f35c92aaf63af04e857065c307ac410707c7ad129fcf310cf717ec57a52e"} Apr 22 18:46:38.898602 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.898542 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dvr84" event={"ID":"add2dc4a-bd5c-417c-91dd-132eb3de7087","Type":"ContainerStarted","Data":"538a01d491876acd070d6f68c4e5927332b928d656d6b5984b426b5a9f96a9e7"} Apr 22 18:46:38.903524 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.903478 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sp2sj" event={"ID":"3453260f-3618-4209-b141-058bfe076e0c","Type":"ContainerStarted","Data":"797f67798bec7fd15c329b2201f318cfb069b6c83ffb0d92ebeebe3ea4146e0c"} Apr 22 18:46:38.932120 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.932093 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dngnr" event={"ID":"b456252b-c126-48fd-ba56-9b92b64d07ce","Type":"ContainerStarted","Data":"b6c9b596297ee11fc780e459c7be130d329335a5e895a8927856634931ad9546"} Apr 22 18:46:38.939222 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.939190 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qh2jx" event={"ID":"697a8ed9-86fe-434b-9bc5-3296e657ff3d","Type":"ContainerStarted","Data":"0c3612405ab4ebac13d88bcc9cfcf78a9974e461ff3eb7b3a21807deaf272f0f"} Apr 22 18:46:38.948455 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.948427 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerStarted","Data":"f2b93e4f24d55d18736c717fa8d47bbfea89ae0a5995903d1714bf4733cb49d2"} Apr 22 18:46:38.957535 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.957513 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grdm6" event={"ID":"96308ab5-cbfb-459e-9e75-9d548626286b","Type":"ContainerStarted","Data":"e0648af3411760e4f659d94a3e80841f9925f862c7cda7d0c3661b244c167292"} Apr 22 18:46:38.970753 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.970724 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-26q7g" event={"ID":"4f604274-eb1a-4b2d-865b-59dbe9dc8461","Type":"ContainerStarted","Data":"f607ac7770e749af8d8ea9e714fd24acc332b59403c31ca5c937f2c8bf4f8116"} Apr 22 18:46:38.995713 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:38.995687 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" event={"ID":"88342289-d6b0-4f23-a7a4-e1b94386e991","Type":"ContainerStarted","Data":"827a2e0c426b14e3bcbf117013fa7dcbb72995805c4e6df1d93f4142c7a2a53f"} Apr 22 18:46:39.338259 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:39.338178 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:39.338423 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.338321 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:39.338423 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.338378 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:41.338360704 +0000 UTC m=+5.061120253 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:39.439135 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:39.439039 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:39.439327 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.439230 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:39.439327 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.439247 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:39.439327 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.439260 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:39.439327 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.439313 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:41.439296515 +0000 UTC m=+5.162056050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:39.762369 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:39.762281 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:37 +0000 UTC" deadline="2028-01-18 02:55:11.78218294 +0000 UTC" Apr 22 18:46:39.762369 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:39.762321 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15248h8m32.019865255s" Apr 22 18:46:39.857721 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:39.857693 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:39.857899 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:39.857736 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:39.857899 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.857844 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:39.858142 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:39.858116 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:41.356824 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:41.355030 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:41.356824 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.355175 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:41.356824 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.355244 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:45.355222993 +0000 UTC m=+9.077982527 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:41.455617 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:41.455518 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:41.455839 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.455642 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:41.455839 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.455665 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:41.455839 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.455678 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:41.455839 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.455736 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:45.455716557 +0000 UTC m=+9.178476107 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:41.859539 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:41.858667 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:41.859539 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.858803 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:41.859539 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:41.859183 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:41.859539 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:41.859288 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:43.858536 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:43.857889 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:43.858536 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:43.858005 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:43.858536 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:43.858401 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:43.858536 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:43.858487 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:44.466995 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.466961 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wmlxm"] Apr 22 18:46:44.471244 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.471219 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.471379 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:44.471291 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:44.581409 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.581209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ae2c297-7264-408e-ba35-12894de1c143-dbus\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.581409 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.581260 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.581409 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.581303 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ae2c297-7264-408e-ba35-12894de1c143-kubelet-config\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.681901 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.681866 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ae2c297-7264-408e-ba35-12894de1c143-dbus\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.681901 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.681912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.682134 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.681949 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ae2c297-7264-408e-ba35-12894de1c143-kubelet-config\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.682134 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.682039 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4ae2c297-7264-408e-ba35-12894de1c143-kubelet-config\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.682230 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:44.682182 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4ae2c297-7264-408e-ba35-12894de1c143-dbus\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:44.682310 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:44.682294 2581 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:44.682367 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:44.682357 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret podName:4ae2c297-7264-408e-ba35-12894de1c143 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:45.182341006 +0000 UTC m=+8.905100541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret") pod "global-pull-secret-syncer-wmlxm" (UID: "4ae2c297-7264-408e-ba35-12894de1c143") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:45.186960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:45.186923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:45.187423 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.187123 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:45.187423 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.187181 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret podName:4ae2c297-7264-408e-ba35-12894de1c143 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.187162801 +0000 UTC m=+9.909922335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret") pod "global-pull-secret-syncer-wmlxm" (UID: "4ae2c297-7264-408e-ba35-12894de1c143") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:45.388415 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:45.388369 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:45.388667 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.388642 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.388818 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.388709 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.388689744 +0000 UTC m=+17.111449291 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.489951 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:45.489364 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:45.489951 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.489553 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:45.489951 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.489569 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:45.489951 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.489577 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.489951 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.489639 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.489619733 +0000 UTC m=+17.212379291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.858187 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:45.858031 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:45.858187 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:45.858080 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:45.858187 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:45.858036 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:45.858187 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.858176 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:45.858616 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.858587 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:45.858713 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:45.858678 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:46.195939 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:46.195873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:46.196306 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:46.196051 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:46.196306 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:46.196098 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret podName:4ae2c297-7264-408e-ba35-12894de1c143 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:48.196085605 +0000 UTC m=+11.918845139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret") pod "global-pull-secret-syncer-wmlxm" (UID: "4ae2c297-7264-408e-ba35-12894de1c143") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:47.858558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:47.858529 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:47.858987 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:47.858529 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:47.858987 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:47.858646 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:47.858987 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:47.858733 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:47.858987 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:47.858542 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:47.858987 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:47.858858 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:48.212166 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:48.212129 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:48.212302 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:48.212282 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:48.212402 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:48.212361 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret podName:4ae2c297-7264-408e-ba35-12894de1c143 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.212340924 +0000 UTC m=+15.935100461 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret") pod "global-pull-secret-syncer-wmlxm" (UID: "4ae2c297-7264-408e-ba35-12894de1c143") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:49.857980 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:49.857946 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:49.858404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:49.857946 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:49.858404 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:49.858083 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:49.858404 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:49.857946 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:49.858404 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:49.858148 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:49.858404 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:49.858227 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:51.858729 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:51.858653 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:51.858729 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:51.858683 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:51.859177 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:51.858775 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:51.859177 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:51.858830 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:51.859177 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:51.858927 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:51.859177 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:51.859025 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:52.245702 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:52.245675 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:52.245891 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:52.245850 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:52.245958 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:52.245918 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret podName:4ae2c297-7264-408e-ba35-12894de1c143 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:00.245904501 +0000 UTC m=+23.968664037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret") pod "global-pull-secret-syncer-wmlxm" (UID: "4ae2c297-7264-408e-ba35-12894de1c143") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:53.453908 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:53.453862 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:53.454369 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.454010 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:53.454369 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.454072 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.454055415 +0000 UTC m=+33.176814952 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:53.554736 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:53.554700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:53.554935 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.554854 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:53.554935 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.554876 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:53.554935 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.554888 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:53.555092 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.554949 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.554931275 +0000 UTC m=+33.277690809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:53.858579 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:53.858502 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:53.858713 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:53.858502 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:53.858713 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.858618 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:53.858713 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:53.858508 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:53.858713 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.858698 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:53.858892 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:53.858760 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:55.858229 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:55.857736 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:55.858229 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:55.857882 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:55.858229 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:55.857890 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:55.858229 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:55.857905 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:55.858229 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:55.857982 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:55.858229 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:55.858051 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:57.032339 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.032173 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" event={"ID":"0aa354c0cbfbaf0036d5f596d6e0335c","Type":"ContainerStarted","Data":"475a18000aa96dd0efe5ed9ee23958316e57607607a8cd0bab9d71ed1e839152"} Apr 22 18:46:57.033706 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.033682 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-26q7g" event={"ID":"4f604274-eb1a-4b2d-865b-59dbe9dc8461","Type":"ContainerStarted","Data":"ccfca3b943f6dd4fe0cd1af359e8f2e91e0b0324f9ecf7d82ddd0113255b9564"} Apr 22 18:46:57.035355 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.035330 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"4b0a84e3046f7415180a62ef90b8d30cd54ff73c3edf649b3c2baa28be981273"} Apr 22 18:46:57.036825 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.036804 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dvr84" event={"ID":"add2dc4a-bd5c-417c-91dd-132eb3de7087","Type":"ContainerStarted","Data":"a1583dc6c8295bbee81fcb581b9d21c8440a219b7bf28235009e7a7b743b9856"} Apr 22 18:46:57.047380 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.047302 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-249.ec2.internal" podStartSLOduration=21.047285423 podStartE2EDuration="21.047285423s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:57.046510455 +0000 UTC m=+20.769270012" watchObservedRunningTime="2026-04-22 18:46:57.047285423 +0000 UTC m=+20.770044980" Apr 22 18:46:57.066319 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.065950 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-26q7g" podStartSLOduration=2.944764198 podStartE2EDuration="21.065934909s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.064645022 +0000 UTC m=+1.787404556" lastFinishedPulling="2026-04-22 18:46:56.185815726 +0000 UTC m=+19.908575267" observedRunningTime="2026-04-22 18:46:57.065322995 +0000 UTC m=+20.788082552" watchObservedRunningTime="2026-04-22 18:46:57.065934909 +0000 UTC m=+20.788694465" Apr 22 18:46:57.083835 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.083338 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dvr84" podStartSLOduration=2.748796252 podStartE2EDuration="21.083321469s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.150482021 +0000 UTC m=+1.873241555" lastFinishedPulling="2026-04-22 18:46:56.48500722 +0000 UTC m=+20.207766772" observedRunningTime="2026-04-22 18:46:57.083256884 +0000 UTC m=+20.806016439" watchObservedRunningTime="2026-04-22 18:46:57.083321469 +0000 UTC m=+20.806081029" Apr 22 18:46:57.858391 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.858165 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:57.858483 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.858167 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:57.858483 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:57.858408 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:57.858483 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:57.858473 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:46:57.858483 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:57.858165 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:57.858618 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:57.858555 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:58.040351 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.040322 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sp2sj" event={"ID":"3453260f-3618-4209-b141-058bfe076e0c","Type":"ContainerStarted","Data":"404e3c844d8a9a648bab6141b49d2f1e33436bd245c759dfbc4e107bf5a3c6bd"} Apr 22 18:46:58.041805 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.041755 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dngnr" event={"ID":"b456252b-c126-48fd-ba56-9b92b64d07ce","Type":"ContainerStarted","Data":"314f2292b837eb4b4c72a4527db0da16c3c3382942e05d6b17fe65982a80e6c5"} Apr 22 18:46:58.043057 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.043035 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qh2jx" event={"ID":"697a8ed9-86fe-434b-9bc5-3296e657ff3d","Type":"ContainerStarted","Data":"c661f01ad1b91545b34b497f9cbae4ce595bfc8d50dc57c3729abb8cd42735ba"} Apr 22 18:46:58.044505 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.044483 2581 generic.go:358] "Generic (PLEG): container finished" podID="63aaeb86-a51a-4444-93df-19041d851cd6" containerID="f779e02f71c64e6c46015ef75d4758fcef4a4fa22e00d6fcd9375cf660dd7746" exitCode=0 Apr 22 18:46:58.044729 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.044708 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerDied","Data":"f779e02f71c64e6c46015ef75d4758fcef4a4fa22e00d6fcd9375cf660dd7746"} Apr 22 18:46:58.046135 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.046031 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grdm6" event={"ID":"96308ab5-cbfb-459e-9e75-9d548626286b","Type":"ContainerStarted","Data":"dbafda979b7882a4561f5c543fcbb5118bfc268a797a3e021744b90fbc64c069"} Apr 22 18:46:58.047663 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.047638 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" event={"ID":"88342289-d6b0-4f23-a7a4-e1b94386e991","Type":"ContainerStarted","Data":"be11b0fb5c21d2ee13d15b706719786b172f716bf86961ddb2bbfe8efb64beda"} Apr 22 18:46:58.050549 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.050527 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"f5d04465a43a26206a88eb769cb64fef2b3bf92367b5b1f3fc8af49df6715c74"} Apr 22 18:46:58.050549 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.050559 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"07aee94ff8f052a8d197fadf3b4931a50a351a1cb053cd002e4a10b6218ca53e"} Apr 22 18:46:58.050686 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.050570 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"9238bafe8cdc0eebe81715cb11399ad8bbc4c304148a821655c5db03ace342a9"} Apr 22 18:46:58.050686 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.050578 2581 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"aa211d0e95d9eec8b732b338ecd77fc7f0fe7210bd98a000e3de7192a1e7afd2"} Apr 22 18:46:58.050686 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.050587 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"1c3735167786aefd23288f49516d70c5994a313bd7059f5a3e3a90a02745e8d7"} Apr 22 18:46:58.052020 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.052000 2581 generic.go:358] "Generic (PLEG): container finished" podID="894b23123d23c1b0610c463e2e9162b9" containerID="84a5582d389b8691cadfe6def775f3b549049fe55008323b1d8d27f9e6418eef" exitCode=0 Apr 22 18:46:58.052123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.052100 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" event={"ID":"894b23123d23c1b0610c463e2e9162b9","Type":"ContainerDied","Data":"84a5582d389b8691cadfe6def775f3b549049fe55008323b1d8d27f9e6418eef"} Apr 22 18:46:58.072914 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.072864 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sp2sj" podStartSLOduration=4.015442667 podStartE2EDuration="22.072850437s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.125504804 +0000 UTC m=+1.848264338" lastFinishedPulling="2026-04-22 18:46:56.182912574 +0000 UTC m=+19.905672108" observedRunningTime="2026-04-22 18:46:58.056115085 +0000 UTC m=+21.778874642" watchObservedRunningTime="2026-04-22 18:46:58.072850437 +0000 UTC m=+21.795609995" Apr 22 18:46:58.086704 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.086661 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-grdm6" podStartSLOduration=3.992727264 podStartE2EDuration="22.086649547s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.08898934 +0000 UTC m=+1.811748877" lastFinishedPulling="2026-04-22 18:46:56.182911614 +0000 UTC m=+19.905671160" observedRunningTime="2026-04-22 18:46:58.086140265 +0000 UTC m=+21.808899821" watchObservedRunningTime="2026-04-22 18:46:58.086649547 +0000 UTC m=+21.809409102" Apr 22 18:46:58.102446 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.102400 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qh2jx" podStartSLOduration=4.022327282 podStartE2EDuration="22.102385665s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.102639923 +0000 UTC m=+1.825399457" lastFinishedPulling="2026-04-22 18:46:56.182698289 +0000 UTC m=+19.905457840" observedRunningTime="2026-04-22 18:46:58.101954847 +0000 UTC m=+21.824714416" watchObservedRunningTime="2026-04-22 18:46:58.102385665 +0000 UTC m=+21.825145222" Apr 22 18:46:58.140002 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.139955 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dngnr" podStartSLOduration=4.070607351 podStartE2EDuration="22.139941952s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.113621321 +0000 UTC m=+1.836380855" lastFinishedPulling="2026-04-22 
18:46:56.182955921 +0000 UTC m=+19.905715456" observedRunningTime="2026-04-22 18:46:58.139881574 +0000 UTC m=+21.862641141" watchObservedRunningTime="2026-04-22 18:46:58.139941952 +0000 UTC m=+21.862701487" Apr 22 18:46:58.452410 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.452375 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:46:58.785226 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.785075 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:58.452398699Z","UUID":"01617391-2579-4f26-9b98-fd4801274f64","Handler":null,"Name":"","Endpoint":""} Apr 22 18:46:58.788036 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.788013 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:46:58.788036 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:58.788042 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:46:59.055980 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:59.055477 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" event={"ID":"894b23123d23c1b0610c463e2e9162b9","Type":"ContainerStarted","Data":"8c8e664d521ee8bb358a01e129355aedb8e0c8784b8767015f92a00dfdc00b44"} Apr 22 18:46:59.057672 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:59.057639 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" event={"ID":"88342289-d6b0-4f23-a7a4-e1b94386e991","Type":"ContainerStarted","Data":"3ffd4a07835a69e710b3c1edb1b05ee1a8fc80ab491a9715fd28f52aff518d19"} Apr 22 18:46:59.072493 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:59.072445 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-249.ec2.internal" podStartSLOduration=23.072432751 podStartE2EDuration="23.072432751s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:59.072016936 +0000 UTC m=+22.794776492" watchObservedRunningTime="2026-04-22 18:46:59.072432751 +0000 UTC m=+22.795192306" Apr 22 18:46:59.857886 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:59.857844 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:46:59.858057 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:59.857899 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:46:59.858057 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:46:59.857941 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:46:59.858057 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:59.858038 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:46:59.858206 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:59.858102 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:46:59.858206 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:46:59.858173 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:47:00.063838 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:00.063804 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"b1f7b975670945a77d61128abbffa4357fe96f2500dbcf527323abf229e6b092"} Apr 22 18:47:00.310245 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:00.310218 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:00.310407 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:00.310391 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:00.310473 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:00.310457 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret podName:4ae2c297-7264-408e-ba35-12894de1c143 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:16.310438232 +0000 UTC m=+40.033197766 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret") pod "global-pull-secret-syncer-wmlxm" (UID: "4ae2c297-7264-408e-ba35-12894de1c143") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:00.815614 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:00.815578 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:47:00.816234 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:00.816208 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:47:01.067480 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:01.067391 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" event={"ID":"88342289-d6b0-4f23-a7a4-e1b94386e991","Type":"ContainerStarted","Data":"778fab183b74429fdb2678ce639ec7c652ce8888956a10cd00bbea637187c9f4"} Apr 22 18:47:01.086838 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:01.086771 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdp8k" podStartSLOduration=2.912431822 podStartE2EDuration="25.086758126s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.051871459 +0000 UTC m=+1.774630993" lastFinishedPulling="2026-04-22 18:47:00.226197762 +0000 UTC m=+23.948957297" observedRunningTime="2026-04-22 18:47:01.086440237 +0000 UTC m=+24.809199792" watchObservedRunningTime="2026-04-22 18:47:01.086758126 +0000 UTC m=+24.809517680" Apr 22 18:47:01.858318 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:01.858238 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:01.858505 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:01.858350 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:01.858505 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:01.858383 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:01.858505 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:01.858444 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:47:01.858505 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:01.858345 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:47:01.858718 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:01.858524 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:47:02.429828 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:02.429679 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:47:02.430445 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:02.429925 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:02.430445 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:02.430246 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dngnr" Apr 22 18:47:03.072149 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.072099 2581 generic.go:358] "Generic (PLEG): container finished" podID="63aaeb86-a51a-4444-93df-19041d851cd6" containerID="2346ce315b8f58fc0eef560f0609a498c40f3492ff2d29de4d18849dd46ba4c1" exitCode=0 Apr 22 18:47:03.072335 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.072187 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerDied","Data":"2346ce315b8f58fc0eef560f0609a498c40f3492ff2d29de4d18849dd46ba4c1"} Apr 22 18:47:03.075692 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.075667 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" event={"ID":"b77cfc29-6e5e-47f5-b607-aa33e5a172af","Type":"ContainerStarted","Data":"91ffd9f50b61b46440d50990a36dcac48311e356b2577da6e82e63b4e7a8aadf"} Apr 22 18:47:03.076023 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.076002 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:47:03.076110 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.076032 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:47:03.076110 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.076041 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:47:03.091262 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.091243 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:47:03.091354 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.091300 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:47:03.131201 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.131165 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" podStartSLOduration=8.230512392 podStartE2EDuration="27.131154305s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.044929585 +0000 UTC m=+1.767689129" 
lastFinishedPulling="2026-04-22 18:46:56.945571493 +0000 UTC m=+20.668331042" observedRunningTime="2026-04-22 18:47:03.130939055 +0000 UTC m=+26.853698611" watchObservedRunningTime="2026-04-22 18:47:03.131154305 +0000 UTC m=+26.853913861" Apr 22 18:47:03.858174 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.858140 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:03.858732 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:03.858258 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:47:03.858732 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.858147 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:03.858732 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:03.858348 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:47:03.858732 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:03.858140 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:03.858732 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:03.858426 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:47:04.099470 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:04.099439 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wmlxm"] Apr 22 18:47:04.099562 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:04.099524 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:04.099628 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:04.099606 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:47:04.103044 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:04.102769 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b4wkh"] Apr 22 18:47:04.103044 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:04.102892 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:04.103044 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:04.103010 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:47:04.103468 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:04.103449 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z9kwg"] Apr 22 18:47:04.103543 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:04.103527 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:04.103622 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:04.103606 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:47:05.083470 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:05.083433 2581 generic.go:358] "Generic (PLEG): container finished" podID="63aaeb86-a51a-4444-93df-19041d851cd6" containerID="2e1b967aa220ba5ffffa90e3daeecf867d12a3ce4c87de5af6545536c0af125b" exitCode=0 Apr 22 18:47:05.083969 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:05.083516 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerDied","Data":"2e1b967aa220ba5ffffa90e3daeecf867d12a3ce4c87de5af6545536c0af125b"} Apr 22 18:47:05.857754 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:05.857731 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:05.857754 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:05.857751 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:05.857907 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:05.857731 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:05.857907 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:05.857837 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:47:05.857980 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:05.857947 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:47:05.858023 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:05.857999 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:47:06.088108 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:06.087902 2581 generic.go:358] "Generic (PLEG): container finished" podID="63aaeb86-a51a-4444-93df-19041d851cd6" containerID="2735f86c3a408924a37c1b5a61a0f366b65d4cbac28d1d16bb5669281b3fc348" exitCode=0 Apr 22 18:47:06.088108 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:06.087984 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerDied","Data":"2735f86c3a408924a37c1b5a61a0f366b65d4cbac28d1d16bb5669281b3fc348"} Apr 22 18:47:07.858387 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:07.858305 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:07.858890 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:07.858427 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4wkh" podUID="2ee4ece0-4e59-4b13-a6f7-140e212f2fd7" Apr 22 18:47:07.858890 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:07.858709 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:07.858890 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:07.858777 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wmlxm" podUID="4ae2c297-7264-408e-ba35-12894de1c143" Apr 22 18:47:07.858890 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:07.858828 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:07.859065 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:07.858914 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:47:09.486863 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.486828 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:09.487376 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.487357 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:09.487433 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.487424 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:41.487406534 +0000 UTC m=+65.210166072 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:09.587485 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.587454 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:09.587637 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.587617 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:47:09.587685 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.587643 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:47:09.587685 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.587654 2581 projected.go:194] Error preparing data for projected volume kube-api-access-r9j84 for pod openshift-network-diagnostics/network-check-target-b4wkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:09.587768 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.587703 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84 podName:2ee4ece0-4e59-4b13-a6f7-140e212f2fd7 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:41.587689544 +0000 UTC m=+65.310449082 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r9j84" (UniqueName: "kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84") pod "network-check-target-b4wkh" (UID: "2ee4ece0-4e59-4b13-a6f7-140e212f2fd7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:09.604297 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.604276 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-249.ec2.internal" event="NodeReady" Apr 22 18:47:09.606564 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.604683 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:47:09.656926 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.656849 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6vxjs"] Apr 22 18:47:09.681863 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.681839 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qbsd9"] Apr 22 18:47:09.682019 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.681997 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.684599 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.684575 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:47:09.684767 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.684725 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w2ls6\"" Apr 22 18:47:09.684767 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.684737 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:47:09.697135 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.697112 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qbsd9"] Apr 22 18:47:09.697135 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.697129 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:09.697293 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.697151 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6vxjs"] Apr 22 18:47:09.699993 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.699972 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q94s6\"" Apr 22 18:47:09.700088 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.700045 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:47:09.700088 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.700077 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:47:09.700198 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.700174 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:47:09.788827 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.788770 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-config-volume\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.788979 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.788862 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-tmp-dir\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.788979 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.788897 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.788979 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.788917 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd94m\" (UniqueName: \"kubernetes.io/projected/a5c3671b-f180-45d3-aad6-34b06441fbac-kube-api-access-sd94m\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:09.789129 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.789011 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvlch\" (UniqueName: \"kubernetes.io/projected/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-kube-api-access-wvlch\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.789129 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.789052 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " 
pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:09.858627 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.858599 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:09.858763 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.858659 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:09.858861 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.858832 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:09.861649 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.861626 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:09.861760 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.861712 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c77rp\"" Apr 22 18:47:09.861836 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.861757 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:47:09.861836 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.861760 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:09.861937 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.861838 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:09.861994 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.861955 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qpsz\"" Apr 22 18:47:09.889485 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:09.889586 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-config-volume\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.889586 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889547 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-tmp-dir\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.889586 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889570 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.889728 
ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd94m\" (UniqueName: \"kubernetes.io/projected/a5c3671b-f180-45d3-aad6-34b06441fbac-kube-api-access-sd94m\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:09.889728 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.889616 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:09.889728 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.889678 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:47:10.389658304 +0000 UTC m=+34.112417856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:09.889728 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.889701 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:09.889965 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:09.889764 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:10.38974678 +0000 UTC m=+34.112506314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:09.889965 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvlch\" (UniqueName: \"kubernetes.io/projected/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-kube-api-access-wvlch\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.889965 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.889924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-tmp-dir\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.890252 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.890231 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-config-volume\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.900858 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.900837 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvlch\" (UniqueName: \"kubernetes.io/projected/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-kube-api-access-wvlch\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:09.900958 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:09.900936 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd94m\" (UniqueName: \"kubernetes.io/projected/a5c3671b-f180-45d3-aad6-34b06441fbac-kube-api-access-sd94m\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:10.392872 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:10.392818 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:10.393060 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:10.392912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:10.393060 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:10.392972 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:10.393060 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:10.393006 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:10.393060 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:10.393041 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.393023092 +0000 UTC m=+35.115782647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:10.393060 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:10.393058 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.393050585 +0000 UTC m=+35.115810118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:11.401538 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:11.401503 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:11.402056 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:11.401578 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:11.402056 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:11.401660 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:11.402056 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:11.401709 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:11.402056 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:11.401735 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:13.401714317 +0000 UTC m=+37.124473861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:11.402056 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:11.401772 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:47:13.401754972 +0000 UTC m=+37.124514515 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:13.104363 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:13.104208 2581 generic.go:358] "Generic (PLEG): container finished" podID="63aaeb86-a51a-4444-93df-19041d851cd6" containerID="fc2a8081c0b4c94dfbc598cf92b8dc11f14c286fe4610bcc5715501249a339a2" exitCode=0 Apr 22 18:47:13.104363 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:13.104282 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerDied","Data":"fc2a8081c0b4c94dfbc598cf92b8dc11f14c286fe4610bcc5715501249a339a2"} Apr 22 18:47:13.414563 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:13.414472 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:13.414563 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:13.414546 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:13.414726 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:13.414628 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:13.414726 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:13.414658 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:13.414726 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:13.414694 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.41467529 +0000 UTC m=+41.137434823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:13.414726 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:13.414711 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.41470405 +0000 UTC m=+41.137463583 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:14.108386 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:14.108359 2581 generic.go:358] "Generic (PLEG): container finished" podID="63aaeb86-a51a-4444-93df-19041d851cd6" containerID="3cef6a6dfa6a44666d77e79bfdb2f1ea8164e5d0ab3cf8ab514aa4dda2d58366" exitCode=0 Apr 22 18:47:14.108718 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:14.108418 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerDied","Data":"3cef6a6dfa6a44666d77e79bfdb2f1ea8164e5d0ab3cf8ab514aa4dda2d58366"} Apr 22 18:47:15.112764 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:15.112730 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" event={"ID":"63aaeb86-a51a-4444-93df-19041d851cd6","Type":"ContainerStarted","Data":"1e1763197ff8d23f18e8f13efc42e8db9ca5166d6d1a3d5eee7becf7d98eaddc"} Apr 22 18:47:15.138763 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:15.138723 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dhp4b" podStartSLOduration=5.237686038 podStartE2EDuration="39.138711259s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:38.144745894 +0000 UTC m=+1.867505428" lastFinishedPulling="2026-04-22 18:47:12.045771112 +0000 UTC m=+35.768530649" observedRunningTime="2026-04-22 18:47:15.138445058 +0000 UTC m=+38.861204614" watchObservedRunningTime="2026-04-22 18:47:15.138711259 +0000 UTC m=+38.861470815" Apr 22 18:47:16.334727 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:16.334692 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:16.338201 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:16.338171 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4ae2c297-7264-408e-ba35-12894de1c143-original-pull-secret\") pod \"global-pull-secret-syncer-wmlxm\" (UID: \"4ae2c297-7264-408e-ba35-12894de1c143\") " pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:16.484942 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:16.484914 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wmlxm" Apr 22 18:47:16.662732 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:16.662706 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wmlxm"] Apr 22 18:47:16.668931 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:47:16.668903 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae2c297_7264_408e_ba35_12894de1c143.slice/crio-f0d9528abd7b0c9d358dfadd63cbcb2f3988291c7a283f01b58a8f03826f6713 WatchSource:0}: Error finding container f0d9528abd7b0c9d358dfadd63cbcb2f3988291c7a283f01b58a8f03826f6713: Status 404 returned error can't find the container with id f0d9528abd7b0c9d358dfadd63cbcb2f3988291c7a283f01b58a8f03826f6713 Apr 22 18:47:17.116770 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:17.116737 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wmlxm" event={"ID":"4ae2c297-7264-408e-ba35-12894de1c143","Type":"ContainerStarted","Data":"f0d9528abd7b0c9d358dfadd63cbcb2f3988291c7a283f01b58a8f03826f6713"} Apr 22 18:47:17.443591 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:17.443559 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:17.444073 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:17.443624 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:17.444073 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:17.443726 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:17.444073 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:17.443723 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:17.444073 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:17.443803 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:25.443766297 +0000 UTC m=+49.166525836 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:17.444073 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:17.443821 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:47:25.443813126 +0000 UTC m=+49.166572660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:21.125501 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:21.125462 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wmlxm" event={"ID":"4ae2c297-7264-408e-ba35-12894de1c143","Type":"ContainerStarted","Data":"6aecc92433ea1cc7a11ba3bb6fb37fd3d775b720f0d43cc7e7737d6ede5ae58d"} Apr 22 18:47:21.141873 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:21.141833 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wmlxm" podStartSLOduration=33.199688009 podStartE2EDuration="37.141819135s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:47:16.670523452 +0000 UTC m=+40.393283003" lastFinishedPulling="2026-04-22 18:47:20.612654577 +0000 UTC m=+44.335414129" observedRunningTime="2026-04-22 18:47:21.141040091 +0000 UTC m=+44.863799639" watchObservedRunningTime="2026-04-22 18:47:21.141819135 +0000 UTC m=+44.864578670" Apr 22 18:47:25.495534 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:25.495495 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:25.495991 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:25.495560 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:25.495991 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:25.495645 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:25.495991 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:25.495714 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:47:41.495696007 +0000 UTC m=+65.218455542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:25.495991 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:25.495650 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:25.495991 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:25.495806 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:41.495775018 +0000 UTC m=+65.218534552 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:35.096035 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:35.096004 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b9cf6" Apr 22 18:47:41.497027 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.496981 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:47:41.497510 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.497043 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:47:41.497510 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.497089 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:47:41.497510 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:41.497137 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:41.497510 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:41.497195 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:41.497510 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:41.497201 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:13.497186391 +0000 UTC m=+97.219945925 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:47:41.497510 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:41.497259 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:48:13.497245472 +0000 UTC m=+97.220005006 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:47:41.499920 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.499901 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:41.507860 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:41.507840 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:41.507966 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:47:41.507898 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:45.507880599 +0000 UTC m=+129.230640133 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : secret "metrics-daemon-secret" not found Apr 22 18:47:41.597798 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.597771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:41.600631 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.600615 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:41.610304 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.610288 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:41.635028 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.635002 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9j84\" (UniqueName: \"kubernetes.io/projected/2ee4ece0-4e59-4b13-a6f7-140e212f2fd7-kube-api-access-r9j84\") pod \"network-check-target-b4wkh\" (UID: \"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7\") " pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:41.687677 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.687653 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c77rp\"" Apr 22 18:47:41.695833 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.695809 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:41.807390 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:41.807362 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b4wkh"] Apr 22 18:47:41.810487 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:47:41.810459 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee4ece0_4e59_4b13_a6f7_140e212f2fd7.slice/crio-792684c7ecc531d3b18ca2a886dff6235f2068b3aa88ce21387507979696bc7f WatchSource:0}: Error finding container 792684c7ecc531d3b18ca2a886dff6235f2068b3aa88ce21387507979696bc7f: Status 404 returned error can't find the container with id 792684c7ecc531d3b18ca2a886dff6235f2068b3aa88ce21387507979696bc7f Apr 22 18:47:42.168142 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:42.168067 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b4wkh" event={"ID":"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7","Type":"ContainerStarted","Data":"792684c7ecc531d3b18ca2a886dff6235f2068b3aa88ce21387507979696bc7f"} Apr 22 18:47:45.176571 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:45.176531 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b4wkh" event={"ID":"2ee4ece0-4e59-4b13-a6f7-140e212f2fd7","Type":"ContainerStarted","Data":"5e9bd09ea2df4fd96f30a49d94a0361e187c4365c792a855429f2afa735ef826"} Apr 22 18:47:45.177050 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:45.176697 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:47:45.193390 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:47:45.193348 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b4wkh" podStartSLOduration=66.460029721 podStartE2EDuration="1m9.193333705s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:47:41.812286955 +0000 UTC m=+65.535046490" lastFinishedPulling="2026-04-22 18:47:44.545590937 +0000 UTC m=+68.268350474" observedRunningTime="2026-04-22 18:47:45.192713063 +0000 UTC m=+68.915472618" watchObservedRunningTime="2026-04-22 18:47:45.193333705 +0000 UTC m=+68.916093261" Apr 22 18:48:13.503466 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:13.503303 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:48:13.503466 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:13.503379 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:48:13.503466 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:13.503470 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:48:13.504071 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:13.503470 2581 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:48:13.504071 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:13.503550 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls podName:e4ec9d5d-c253-4d72-ba5a-a2af35c106d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:17.50352863 +0000 UTC m=+161.226288164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls") pod "dns-default-6vxjs" (UID: "e4ec9d5d-c253-4d72-ba5a-a2af35c106d2") : secret "dns-default-metrics-tls" not found Apr 22 18:48:13.504071 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:13.503566 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert podName:a5c3671b-f180-45d3-aad6-34b06441fbac nodeName:}" failed. No retries permitted until 2026-04-22 18:49:17.50355951 +0000 UTC m=+161.226319045 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert") pod "ingress-canary-qbsd9" (UID: "a5c3671b-f180-45d3-aad6-34b06441fbac") : secret "canary-serving-cert" not found Apr 22 18:48:16.181398 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:16.181368 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b4wkh" Apr 22 18:48:45.513284 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:45.513239 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:48:45.513911 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:45.513383 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:48:45.513911 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:45.513460 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs podName:47eec246-c244-4918-8600-48de7568588b nodeName:}" failed. No retries permitted until 2026-04-22 18:50:47.513440734 +0000 UTC m=+251.236200271 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs") pod "network-metrics-daemon-z9kwg" (UID: "47eec246-c244-4918-8600-48de7568588b") : secret "metrics-daemon-secret" not found Apr 22 18:48:57.146618 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.146582 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-578f969574-tk4dr"] Apr 22 18:48:57.149664 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.149641 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.154542 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.154512 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 18:48:57.154667 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.154516 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 18:48:57.154667 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.154523 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-mznkt\"" Apr 22 18:48:57.154667 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.154523 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 18:48:57.155413 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.155392 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:48:57.155413 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.155407 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:48:57.155579 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.155407 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 18:48:57.161136 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.161114 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-578f969574-tk4dr"] Apr 22 18:48:57.246869 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.246849 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qkfrb"] Apr 22 18:48:57.249819 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.249780 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd"] Apr 22 18:48:57.249973 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.249955 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.252684 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.252665 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.253428 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.253411 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:48:57.253672 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.253657 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:48:57.253930 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.253916 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:48:57.256044 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.256023 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:57.256128 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.256045 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:57.257211 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.256958 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pbsnd\"" Apr 22 18:48:57.257211 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.257025 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:48:57.257211 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.257089 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-f6287\"" Apr 22 18:48:57.260572 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.260544 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd"] Apr 22 18:48:57.261947 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.261925 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:48:57.262263 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.262248 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:48:57.263065 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.263044 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qkfrb"] Apr 22 18:48:57.287597 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.287572 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.287681 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.287605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-stats-auth\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " 
pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.287681 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.287622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btww\" (UniqueName: \"kubernetes.io/projected/b44c3680-e1d1-4e14-b58a-8dccd8912f42-kube-api-access-9btww\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.287753 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.287694 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.287753 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.287733 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-default-certificate\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.347335 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.347316 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx"] Apr 22 18:48:57.350031 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.350009 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k"] Apr 22 18:48:57.350166 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.350151 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" Apr 22 18:48:57.352511 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.352496 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.353015 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.352995 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:57.353107 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.353048 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:57.353107 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.353076 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-g4vkf\"" Apr 22 18:48:57.354753 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.354734 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:48:57.355180 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.355166 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:57.355305 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.355283 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:57.355398 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.355354 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-7kv5q\"" Apr 22 18:48:57.355621 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.355607 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:48:57.357999 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.357982 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k"] Apr 22 18:48:57.359041 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.359013 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx"] Apr 22 18:48:57.388118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388093 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vhh\" (UniqueName: \"kubernetes.io/projected/705539a2-839e-49d6-b593-4edbd2dce2aa-kube-api-access-t9vhh\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.388225 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388135 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.388225 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388160 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-stats-auth\") pod \"router-default-578f969574-tk4dr\" 
(UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.388225 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388192 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9btww\" (UniqueName: \"kubernetes.io/projected/b44c3680-e1d1-4e14-b58a-8dccd8912f42-kube-api-access-9btww\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.388225 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388221 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705539a2-839e-49d6-b593-4edbd2dce2aa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.388448 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388247 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705539a2-839e-49d6-b593-4edbd2dce2aa-service-ca-bundle\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.388448 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.388272 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:57.888252117 +0000 UTC m=+141.611011654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:57.388550 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.388550 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388471 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/705539a2-839e-49d6-b593-4edbd2dce2aa-tmp\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.388550 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388502 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qdmd\" (UniqueName: \"kubernetes.io/projected/5e01ef15-d5cc-485f-9813-b674754792b7-kube-api-access-9qdmd\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.388550 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/705539a2-839e-49d6-b593-4edbd2dce2aa-snapshots\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.388716 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388555 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.388716 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388606 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/705539a2-839e-49d6-b593-4edbd2dce2aa-serving-cert\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.388716 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.388638 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:57.388716 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.388652 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-default-certificate\") pod \"router-default-578f969574-tk4dr\" (UID: 
\"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.388716 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.388681 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:57.888665829 +0000 UTC m=+141.611425364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : secret "router-metrics-certs-default" not found Apr 22 18:48:57.390558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.390541 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-stats-auth\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.390897 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.390879 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-default-certificate\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.396760 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.396713 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btww\" (UniqueName: \"kubernetes.io/projected/b44c3680-e1d1-4e14-b58a-8dccd8912f42-kube-api-access-9btww\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.489615 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489591 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.489714 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489624 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/705539a2-839e-49d6-b593-4edbd2dce2aa-tmp\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.489714 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489651 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qdmd\" (UniqueName: \"kubernetes.io/projected/5e01ef15-d5cc-485f-9813-b674754792b7-kube-api-access-9qdmd\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.489714 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489691 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" 
(UniqueName: \"kubernetes.io/empty-dir/705539a2-839e-49d6-b593-4edbd2dce2aa-snapshots\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.489900 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.489712 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:57.489900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489722 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xzf\" (UniqueName: \"kubernetes.io/projected/5e8b4e54-c935-40c6-be8a-d2c22c575aa3-kube-api-access-k6xzf\") pod \"volume-data-source-validator-7c6cbb6c87-2n2sx\" (UID: \"5e8b4e54-c935-40c6-be8a-d2c22c575aa3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" Apr 22 18:48:57.489900 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.489771 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls podName:5e01ef15-d5cc-485f-9813-b674754792b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:57.989755656 +0000 UTC m=+141.712515208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tjgcd" (UID: "5e01ef15-d5cc-485f-9813-b674754792b7") : secret "samples-operator-tls" not found Apr 22 18:48:57.489900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489838 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73aa7185-77ed-4fd2-ae5a-96192fefe723-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.489900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489878 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/705539a2-839e-49d6-b593-4edbd2dce2aa-serving-cert\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.490138 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489941 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vhh\" (UniqueName: \"kubernetes.io/projected/705539a2-839e-49d6-b593-4edbd2dce2aa-kube-api-access-t9vhh\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.490138 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73aa7185-77ed-4fd2-ae5a-96192fefe723-config\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.490138 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.489980 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/705539a2-839e-49d6-b593-4edbd2dce2aa-tmp\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.490138 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.490019 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705539a2-839e-49d6-b593-4edbd2dce2aa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.490138 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.490045 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705539a2-839e-49d6-b593-4edbd2dce2aa-service-ca-bundle\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.490138 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.490071 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttljp\" (UniqueName: \"kubernetes.io/projected/73aa7185-77ed-4fd2-ae5a-96192fefe723-kube-api-access-ttljp\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.491239 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.491213 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705539a2-839e-49d6-b593-4edbd2dce2aa-service-ca-bundle\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.491575 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.491550 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705539a2-839e-49d6-b593-4edbd2dce2aa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.492615 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.492592 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/705539a2-839e-49d6-b593-4edbd2dce2aa-snapshots\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.494173 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.492870 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/705539a2-839e-49d6-b593-4edbd2dce2aa-serving-cert\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.499932 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.499911 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qdmd\" (UniqueName: 
\"kubernetes.io/projected/5e01ef15-d5cc-485f-9813-b674754792b7-kube-api-access-9qdmd\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.500029 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.500012 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vhh\" (UniqueName: \"kubernetes.io/projected/705539a2-839e-49d6-b593-4edbd2dce2aa-kube-api-access-t9vhh\") pod \"insights-operator-585dfdc468-qkfrb\" (UID: \"705539a2-839e-49d6-b593-4edbd2dce2aa\") " pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.560418 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.560399 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" Apr 22 18:48:57.591118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.591084 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xzf\" (UniqueName: \"kubernetes.io/projected/5e8b4e54-c935-40c6-be8a-d2c22c575aa3-kube-api-access-k6xzf\") pod \"volume-data-source-validator-7c6cbb6c87-2n2sx\" (UID: \"5e8b4e54-c935-40c6-be8a-d2c22c575aa3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" Apr 22 18:48:57.591242 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.591136 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73aa7185-77ed-4fd2-ae5a-96192fefe723-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.591242 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.591205 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73aa7185-77ed-4fd2-ae5a-96192fefe723-config\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.591433 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.591406 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttljp\" (UniqueName: \"kubernetes.io/projected/73aa7185-77ed-4fd2-ae5a-96192fefe723-kube-api-access-ttljp\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.591900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.591879 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73aa7185-77ed-4fd2-ae5a-96192fefe723-config\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.593276 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.593252 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73aa7185-77ed-4fd2-ae5a-96192fefe723-serving-cert\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.600239 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.600220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttljp\" (UniqueName: \"kubernetes.io/projected/73aa7185-77ed-4fd2-ae5a-96192fefe723-kube-api-access-ttljp\") pod \"service-ca-operator-d6fc45fc5-xgv2k\" (UID: \"73aa7185-77ed-4fd2-ae5a-96192fefe723\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.600533 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.600515 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xzf\" (UniqueName: \"kubernetes.io/projected/5e8b4e54-c935-40c6-be8a-d2c22c575aa3-kube-api-access-k6xzf\") pod \"volume-data-source-validator-7c6cbb6c87-2n2sx\" (UID: \"5e8b4e54-c935-40c6-be8a-d2c22c575aa3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" Apr 22 18:48:57.660502 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.660438 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" Apr 22 18:48:57.665169 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.665151 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" Apr 22 18:48:57.672039 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.672020 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qkfrb"] Apr 22 18:48:57.675019 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:48:57.674992 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705539a2_839e_49d6_b593_4edbd2dce2aa.slice/crio-f668540a235c3eb3ae44e47a528f7152a87653dcdad676890a01ec19283ac8ba WatchSource:0}: Error finding container f668540a235c3eb3ae44e47a528f7152a87653dcdad676890a01ec19283ac8ba: Status 404 returned error can't find the container with id f668540a235c3eb3ae44e47a528f7152a87653dcdad676890a01ec19283ac8ba Apr 22 18:48:57.783837 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.783810 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k"] Apr 22 18:48:57.788034 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:48:57.788005 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73aa7185_77ed_4fd2_ae5a_96192fefe723.slice/crio-c284e59b2fbc891d65995aceeac4e5923ff3a6ac3b52eef991bbe1f68cd01063 WatchSource:0}: Error finding container c284e59b2fbc891d65995aceeac4e5923ff3a6ac3b52eef991bbe1f68cd01063: Status 404 returned error can't find the container with id c284e59b2fbc891d65995aceeac4e5923ff3a6ac3b52eef991bbe1f68cd01063 Apr 22 18:48:57.798147 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.798127 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx"] Apr 22 18:48:57.800450 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:48:57.800428 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e8b4e54_c935_40c6_be8a_d2c22c575aa3.slice/crio-4f2577a80a8c9b014488be1bd0583a9ed5eeb8f630428ebfdfeb4f6a6355fbee 
WatchSource:0}: Error finding container 4f2577a80a8c9b014488be1bd0583a9ed5eeb8f630428ebfdfeb4f6a6355fbee: Status 404 returned error can't find the container with id 4f2577a80a8c9b014488be1bd0583a9ed5eeb8f630428ebfdfeb4f6a6355fbee Apr 22 18:48:57.893101 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.893073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.893189 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.893155 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:57.893241 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.893223 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:57.893303 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.893294 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:58.893276552 +0000 UTC m=+142.616036088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : secret "router-metrics-certs-default" not found Apr 22 18:48:57.893346 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.893311 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:58.89330446 +0000 UTC m=+142.616063994 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:57.993834 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:57.993811 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:57.993980 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.993961 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:57.994052 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:57.994029 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls podName:5e01ef15-d5cc-485f-9813-b674754792b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:58.994009957 +0000 UTC m=+142.716769494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tjgcd" (UID: "5e01ef15-d5cc-485f-9813-b674754792b7") : secret "samples-operator-tls" not found Apr 22 18:48:58.317832 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:58.317737 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" event={"ID":"5e8b4e54-c935-40c6-be8a-d2c22c575aa3","Type":"ContainerStarted","Data":"4f2577a80a8c9b014488be1bd0583a9ed5eeb8f630428ebfdfeb4f6a6355fbee"} Apr 22 18:48:58.318826 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:58.318781 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" event={"ID":"705539a2-839e-49d6-b593-4edbd2dce2aa","Type":"ContainerStarted","Data":"f668540a235c3eb3ae44e47a528f7152a87653dcdad676890a01ec19283ac8ba"} Apr 22 18:48:58.319820 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:58.319779 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" event={"ID":"73aa7185-77ed-4fd2-ae5a-96192fefe723","Type":"ContainerStarted","Data":"c284e59b2fbc891d65995aceeac4e5923ff3a6ac3b52eef991bbe1f68cd01063"} Apr 22 18:48:58.900959 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:58.900685 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:58.900959 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:58.900779 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: 
\"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:48:58.900959 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:58.900865 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.900845796 +0000 UTC m=+144.623605329 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : configmap references non-existent config key: service-ca.crt Apr 22 18:48:58.900959 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:58.900902 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:48:58.900959 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:58.900937 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.900926304 +0000 UTC m=+144.623685840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : secret "router-metrics-certs-default" not found Apr 22 18:48:59.002094 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:59.002056 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:48:59.002281 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:59.002262 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:59.002348 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:48:59.002333 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls podName:5e01ef15-d5cc-485f-9813-b674754792b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:01.002311992 +0000 UTC m=+144.725071540 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tjgcd" (UID: "5e01ef15-d5cc-485f-9813-b674754792b7") : secret "samples-operator-tls" not found Apr 22 18:48:59.323457 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:59.323418 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" event={"ID":"5e8b4e54-c935-40c6-be8a-d2c22c575aa3","Type":"ContainerStarted","Data":"84bab3aa82129208a8d29b9c74cffeaf7ff37537e82363a4b0c8f6b9a22cd546"} Apr 22 18:48:59.341403 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:48:59.341350 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-2n2sx" podStartSLOduration=1.063132604 podStartE2EDuration="2.341330938s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:48:57.802091491 +0000 UTC m=+141.524851028" lastFinishedPulling="2026-04-22 18:48:59.080289816 +0000 UTC m=+142.803049362" observedRunningTime="2026-04-22 18:48:59.340236432 +0000 UTC m=+143.062995989" watchObservedRunningTime="2026-04-22 18:48:59.341330938 +0000 UTC m=+143.064090495" Apr 22 18:49:00.326402 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:00.326375 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" event={"ID":"705539a2-839e-49d6-b593-4edbd2dce2aa","Type":"ContainerStarted","Data":"41836f3bc34605fcee8b56bb71b734e8740db069df8f3c941d88061d023ff4f7"} Apr 22 18:49:00.327825 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:00.327778 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" event={"ID":"73aa7185-77ed-4fd2-ae5a-96192fefe723","Type":"ContainerStarted","Data":"60cc49653897e78552f0d46a1aebd59126f740a0f64950024da1962e0c3e839d"} Apr 22 18:49:00.344967 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:00.344922 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" podStartSLOduration=0.774288771 podStartE2EDuration="3.344905337s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:48:57.677125126 +0000 UTC m=+141.399884661" lastFinishedPulling="2026-04-22 18:49:00.247741693 +0000 UTC m=+143.970501227" observedRunningTime="2026-04-22 18:49:00.344394981 +0000 UTC m=+144.067154538" watchObservedRunningTime="2026-04-22 18:49:00.344905337 +0000 UTC m=+144.067664892" Apr 22 18:49:00.359928 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:00.359883 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" podStartSLOduration=0.903761694 podStartE2EDuration="3.359868434s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:48:57.789884172 +0000 UTC m=+141.512643706" lastFinishedPulling="2026-04-22 18:49:00.24599091 +0000 UTC m=+143.968750446" observedRunningTime="2026-04-22 18:49:00.359438342 +0000 UTC m=+144.082197901" watchObservedRunningTime="2026-04-22 18:49:00.359868434 +0000 UTC m=+144.082628036" Apr 22 18:49:00.918640 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:00.918607 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:00.918816 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:00.918700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:00.918816 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:00.918753 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:04.918728384 +0000 UTC m=+148.641487918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : configmap references non-existent config key: service-ca.crt Apr 22 18:49:00.918903 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:00.918824 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:49:00.918903 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:00.918878 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:04.918863227 +0000 UTC m=+148.641622766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : secret "router-metrics-certs-default" not found Apr 22 18:49:01.020146 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.020116 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:49:01.020268 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:01.020248 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:49:01.020305 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:01.020294 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls podName:5e01ef15-d5cc-485f-9813-b674754792b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:05.020283414 +0000 UTC m=+148.743042948 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tjgcd" (UID: "5e01ef15-d5cc-485f-9813-b674754792b7") : secret "samples-operator-tls" not found Apr 22 18:49:01.758649 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.758616 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7"] Apr 22 18:49:01.762617 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.762588 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" Apr 22 18:49:01.765278 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.765253 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6jzvf\"" Apr 22 18:49:01.765401 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.765253 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:49:01.766268 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.766248 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:49:01.768351 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.768308 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7"] Apr 22 18:49:01.928104 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:01.928075 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qddkx\" (UniqueName: \"kubernetes.io/projected/0dc69694-41de-4c17-9221-d8d5fed0aed2-kube-api-access-qddkx\") pod \"migrator-74bb7799d9-qsss7\" (UID: \"0dc69694-41de-4c17-9221-d8d5fed0aed2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" Apr 22 18:49:02.029127 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:02.029067 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qddkx\" (UniqueName: \"kubernetes.io/projected/0dc69694-41de-4c17-9221-d8d5fed0aed2-kube-api-access-qddkx\") pod \"migrator-74bb7799d9-qsss7\" (UID: \"0dc69694-41de-4c17-9221-d8d5fed0aed2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" Apr 22 18:49:02.037890 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:02.037871 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qddkx\" (UniqueName: \"kubernetes.io/projected/0dc69694-41de-4c17-9221-d8d5fed0aed2-kube-api-access-qddkx\") pod \"migrator-74bb7799d9-qsss7\" (UID: \"0dc69694-41de-4c17-9221-d8d5fed0aed2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" Apr 22 18:49:02.072952 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:02.072927 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" Apr 22 18:49:02.190951 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:02.190870 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7"] Apr 22 18:49:02.193384 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:02.193353 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc69694_41de_4c17_9221_d8d5fed0aed2.slice/crio-d8831ce3c08d26f0f784430f378064b0bbddd6ab31cd69ee076566086db7d7e9 WatchSource:0}: Error finding container d8831ce3c08d26f0f784430f378064b0bbddd6ab31cd69ee076566086db7d7e9: Status 404 returned error can't find the container with id d8831ce3c08d26f0f784430f378064b0bbddd6ab31cd69ee076566086db7d7e9 Apr 22 18:49:02.332389 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:02.332322 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" event={"ID":"0dc69694-41de-4c17-9221-d8d5fed0aed2","Type":"ContainerStarted","Data":"d8831ce3c08d26f0f784430f378064b0bbddd6ab31cd69ee076566086db7d7e9"} Apr 22 18:49:03.925760 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:03.925743 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sp2sj_3453260f-3618-4209-b141-058bfe076e0c/dns-node-resolver/0.log" Apr 22 18:49:04.339060 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:04.339028 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" event={"ID":"0dc69694-41de-4c17-9221-d8d5fed0aed2","Type":"ContainerStarted","Data":"e4447ff83747313ca53a27b43953f98d96916690ff428adfc423817bdb4800f7"} Apr 22 18:49:04.339216 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:04.339065 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" event={"ID":"0dc69694-41de-4c17-9221-d8d5fed0aed2","Type":"ContainerStarted","Data":"27dabcf7e1711bcaa0df8d1d86e05783e1132ce954925d2a9fcaf39389cb544d"} Apr 22 18:49:04.355237 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:04.355187 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qsss7" podStartSLOduration=1.669099981 podStartE2EDuration="3.355171298s" podCreationTimestamp="2026-04-22 18:49:01 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.195309273 +0000 UTC m=+145.918068808" lastFinishedPulling="2026-04-22 18:49:03.88138059 +0000 UTC m=+147.604140125" observedRunningTime="2026-04-22 18:49:04.354966703 +0000 UTC m=+148.077726258" watchObservedRunningTime="2026-04-22 18:49:04.355171298 +0000 UTC m=+148.077930851" Apr 22 18:49:04.523623 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:04.523600 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-grdm6_96308ab5-cbfb-459e-9e75-9d548626286b/node-ca/0.log" Apr 22 18:49:04.948025 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:04.947993 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:04.948412 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:49:04.948079 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:04.948412 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:04.948129 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:49:04.948412 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:04.948194 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:12.948169545 +0000 UTC m=+156.670929078 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : secret "router-metrics-certs-default" not found Apr 22 18:49:04.948412 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:04.948239 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:12.948224829 +0000 UTC m=+156.670984363 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : configmap references non-existent config key: service-ca.crt Apr 22 18:49:05.048581 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:05.048556 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:49:05.048690 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:05.048647 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:49:05.048690 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:05.048688 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls podName:5e01ef15-d5cc-485f-9813-b674754792b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:13.048677976 +0000 UTC m=+156.771437510 (durationBeforeRetry 8s). 
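In the pod_startup_latency_tracker entries above, the two reported figures are consistent with podStartE2EDuration being the gap between podCreationTimestamp and watchObservedRunningTime, and podStartSLOduration being that gap minus the image-pull window (lastFinishedPulling − firstStartedPulling); the kubelet's internal definitions are not shown in the log, and the arithmetic below only restates the logged values for migrator-74bb7799d9-qsss7:

    # Reproducing the tracker figures for migrator-74bb7799d9-qsss7 from the
    # seconds component of the logged timestamps (all fall within 18:49:xx).
    created   = 1.000000000   # podCreationTimestamp       18:49:01
    pull_from = 2.195309273   # firstStartedPulling        18:49:02.195309273
    pull_to   = 3.881380590   # lastFinishedPulling        18:49:03.881380590
    observed  = 4.355171298   # watchObservedRunningTime   18:49:04.355171298

    e2e = observed - created             # 3.355171298s, the logged podStartE2EDuration
    slo = e2e - (pull_to - pull_from)    # 1.669099981s, the logged podStartSLOduration
    print(round(e2e, 9), round(slo, 9))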
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tjgcd" (UID: "5e01ef15-d5cc-485f-9813-b674754792b7") : secret "samples-operator-tls" not found Apr 22 18:49:12.692727 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:12.692684 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6vxjs" podUID="e4ec9d5d-c253-4d72-ba5a-a2af35c106d2" Apr 22 18:49:12.708101 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:12.708072 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qbsd9" podUID="a5c3671b-f180-45d3-aad6-34b06441fbac" Apr 22 18:49:12.884073 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:12.884040 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-z9kwg" podUID="47eec246-c244-4918-8600-48de7568588b" Apr 22 18:49:13.001767 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.001707 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:13.001908 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.001783 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:13.001908 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:13.001902 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle podName:b44c3680-e1d1-4e14-b58a-8dccd8912f42 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:29.001886871 +0000 UTC m=+172.724646405 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle") pod "router-default-578f969574-tk4dr" (UID: "b44c3680-e1d1-4e14-b58a-8dccd8912f42") : configmap references non-existent config key: service-ca.crt Apr 22 18:49:13.004266 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.004235 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44c3680-e1d1-4e14-b58a-8dccd8912f42-metrics-certs\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:13.103010 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.102982 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:49:13.105198 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.105170 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e01ef15-d5cc-485f-9813-b674754792b7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tjgcd\" (UID: \"5e01ef15-d5cc-485f-9813-b674754792b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:49:13.166554 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.166529 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" Apr 22 18:49:13.279959 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.279897 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd"] Apr 22 18:49:13.361392 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.361360 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" event={"ID":"5e01ef15-d5cc-485f-9813-b674754792b7","Type":"ContainerStarted","Data":"04b757748a9fd2bcda33c9331c68d544580ffc0552be18d06902f6369ff0b9eb"} Apr 22 18:49:13.361505 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.361416 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6vxjs" Apr 22 18:49:13.361579 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:13.361565 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:49:15.368346 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:15.368314 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" event={"ID":"5e01ef15-d5cc-485f-9813-b674754792b7","Type":"ContainerStarted","Data":"66ecbb99efb5f364486e93b6151f83a2a199e7a7069ca5f70dad0af3d17461f1"} Apr 22 18:49:15.368346 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:15.368348 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" event={"ID":"5e01ef15-d5cc-485f-9813-b674754792b7","Type":"ContainerStarted","Data":"c99eb84275c212d4fcad5bf7bef93afb8733cc2125be6b53445e338f3509b5db"} Apr 22 18:49:15.390431 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:15.388342 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tjgcd" podStartSLOduration=16.95972272 podStartE2EDuration="18.388325584s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:49:13.327262581 +0000 UTC m=+157.050022115" lastFinishedPulling="2026-04-22 18:49:14.755865424 +0000 UTC m=+158.478624979" observedRunningTime="2026-04-22 18:49:15.386483211 +0000 UTC m=+159.109242767" watchObservedRunningTime="2026-04-22 18:49:15.388325584 +0000 UTC m=+159.111085141" Apr 22 18:49:17.534260 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.534225 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:49:17.534710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.534308 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:49:17.536812 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.536763 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ec9d5d-c253-4d72-ba5a-a2af35c106d2-metrics-tls\") pod \"dns-default-6vxjs\" (UID: \"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2\") " pod="openshift-dns/dns-default-6vxjs" Apr 22 18:49:17.537015 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.536994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c3671b-f180-45d3-aad6-34b06441fbac-cert\") pod \"ingress-canary-qbsd9\" (UID: \"a5c3671b-f180-45d3-aad6-34b06441fbac\") " pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:49:17.564524 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.564504 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w2ls6\"" Apr 22 18:49:17.566634 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.566618 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q94s6\"" Apr 22 18:49:17.572337 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.572315 2581 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qbsd9" Apr 22 18:49:17.572337 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.572332 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6vxjs" Apr 22 18:49:17.695263 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.695235 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6vxjs"] Apr 22 18:49:17.697940 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:17.697913 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ec9d5d_c253_4d72_ba5a_a2af35c106d2.slice/crio-0a26143b768a74caca83fbaf627ccb7f5793b1c6f67374f8d7a1c052e04c6a2e WatchSource:0}: Error finding container 0a26143b768a74caca83fbaf627ccb7f5793b1c6f67374f8d7a1c052e04c6a2e: Status 404 returned error can't find the container with id 0a26143b768a74caca83fbaf627ccb7f5793b1c6f67374f8d7a1c052e04c6a2e Apr 22 18:49:17.718083 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:17.718009 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qbsd9"] Apr 22 18:49:17.720109 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:17.720083 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c3671b_f180_45d3_aad6_34b06441fbac.slice/crio-a2cdf144941cca80619507c3f2e77fbb85eeb2ccb3c061e311c6efe9d43ecd24 WatchSource:0}: Error finding container a2cdf144941cca80619507c3f2e77fbb85eeb2ccb3c061e311c6efe9d43ecd24: Status 404 returned error can't find the container with id a2cdf144941cca80619507c3f2e77fbb85eeb2ccb3c061e311c6efe9d43ecd24 Apr 22 18:49:18.377727 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:18.377669 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6vxjs" event={"ID":"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2","Type":"ContainerStarted","Data":"0a26143b768a74caca83fbaf627ccb7f5793b1c6f67374f8d7a1c052e04c6a2e"} Apr 22 18:49:18.378941 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:18.378914 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qbsd9" event={"ID":"a5c3671b-f180-45d3-aad6-34b06441fbac","Type":"ContainerStarted","Data":"a2cdf144941cca80619507c3f2e77fbb85eeb2ccb3c061e311c6efe9d43ecd24"} Apr 22 18:49:20.385057 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:20.385015 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6vxjs" event={"ID":"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2","Type":"ContainerStarted","Data":"5d194a37d1c3ff969d0640ba7265a0bb79d878e3b9db432d014996b570fe3d9b"} Apr 22 18:49:20.385697 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:20.385066 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6vxjs" event={"ID":"e4ec9d5d-c253-4d72-ba5a-a2af35c106d2","Type":"ContainerStarted","Data":"2f9a5ae8a49787f693347e20d2119cf8fad881c337c38aeedd6ba3718de74df9"} Apr 22 18:49:20.385697 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:20.385160 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6vxjs" Apr 22 18:49:20.386443 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:20.386420 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qbsd9" 
event={"ID":"a5c3671b-f180-45d3-aad6-34b06441fbac","Type":"ContainerStarted","Data":"55d9dec38ed0b20834b43252fa0f3ce29f4775f5526addb515ed9185a9363339"} Apr 22 18:49:20.403494 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:20.403452 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6vxjs" podStartSLOduration=129.286995681 podStartE2EDuration="2m11.403440125s" podCreationTimestamp="2026-04-22 18:47:09 +0000 UTC" firstStartedPulling="2026-04-22 18:49:17.699735358 +0000 UTC m=+161.422494892" lastFinishedPulling="2026-04-22 18:49:19.816179801 +0000 UTC m=+163.538939336" observedRunningTime="2026-04-22 18:49:20.401402422 +0000 UTC m=+164.124161978" watchObservedRunningTime="2026-04-22 18:49:20.403440125 +0000 UTC m=+164.126199680" Apr 22 18:49:20.417144 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:20.417109 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qbsd9" podStartSLOduration=129.31911135000001 podStartE2EDuration="2m11.417097439s" podCreationTimestamp="2026-04-22 18:47:09 +0000 UTC" firstStartedPulling="2026-04-22 18:49:17.721727013 +0000 UTC m=+161.444486547" lastFinishedPulling="2026-04-22 18:49:19.819713096 +0000 UTC m=+163.542472636" observedRunningTime="2026-04-22 18:49:20.416281954 +0000 UTC m=+164.139041522" watchObservedRunningTime="2026-04-22 18:49:20.417097439 +0000 UTC m=+164.139856995" Apr 22 18:49:23.827782 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.827750 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hd5dg"] Apr 22 18:49:23.831117 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.831093 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:23.834936 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.834913 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:49:23.835063 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.834913 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-n2rqf\"" Apr 22 18:49:23.835063 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.834915 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:49:23.842267 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.842241 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hd5dg"] Apr 22 18:49:23.858306 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.858283 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:49:23.979659 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.979633 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/91926044-1ae6-4a59-b70b-8694263f69bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:23.979771 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.979676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/91926044-1ae6-4a59-b70b-8694263f69bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:23.979771 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.979709 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/91926044-1ae6-4a59-b70b-8694263f69bc-crio-socket\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:23.979771 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.979746 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmtk\" (UniqueName: \"kubernetes.io/projected/91926044-1ae6-4a59-b70b-8694263f69bc-kube-api-access-dsmtk\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:23.979921 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:23.979851 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/91926044-1ae6-4a59-b70b-8694263f69bc-data-volume\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080200 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080124 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/91926044-1ae6-4a59-b70b-8694263f69bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080200 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080167 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/91926044-1ae6-4a59-b70b-8694263f69bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080225 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/91926044-1ae6-4a59-b70b-8694263f69bc-crio-socket\") pod \"insights-runtime-extractor-hd5dg\" (UID: 
\"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080265 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsmtk\" (UniqueName: \"kubernetes.io/projected/91926044-1ae6-4a59-b70b-8694263f69bc-kube-api-access-dsmtk\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080328 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/91926044-1ae6-4a59-b70b-8694263f69bc-data-volume\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080344 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/91926044-1ae6-4a59-b70b-8694263f69bc-crio-socket\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080654 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/91926044-1ae6-4a59-b70b-8694263f69bc-data-volume\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.080726 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.080692 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/91926044-1ae6-4a59-b70b-8694263f69bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.082611 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.082594 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/91926044-1ae6-4a59-b70b-8694263f69bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.091480 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.091459 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsmtk\" (UniqueName: \"kubernetes.io/projected/91926044-1ae6-4a59-b70b-8694263f69bc-kube-api-access-dsmtk\") pod \"insights-runtime-extractor-hd5dg\" (UID: \"91926044-1ae6-4a59-b70b-8694263f69bc\") " pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.141124 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.141103 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hd5dg" Apr 22 18:49:24.256503 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.256415 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hd5dg"] Apr 22 18:49:24.258877 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:24.258846 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91926044_1ae6_4a59_b70b_8694263f69bc.slice/crio-92bcb6400863b3e6d1ecc5653d9856bfd01fd28a4fb605fedde6b33a9058a6b7 WatchSource:0}: Error finding container 92bcb6400863b3e6d1ecc5653d9856bfd01fd28a4fb605fedde6b33a9058a6b7: Status 404 returned error can't find the container with id 92bcb6400863b3e6d1ecc5653d9856bfd01fd28a4fb605fedde6b33a9058a6b7 Apr 22 18:49:24.402423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.402358 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hd5dg" event={"ID":"91926044-1ae6-4a59-b70b-8694263f69bc","Type":"ContainerStarted","Data":"02a1790366678827733348e82f07d1f4b74691062f3a8c799af5451b58eab4ff"} Apr 22 18:49:24.402423 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:24.402393 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hd5dg" event={"ID":"91926044-1ae6-4a59-b70b-8694263f69bc","Type":"ContainerStarted","Data":"92bcb6400863b3e6d1ecc5653d9856bfd01fd28a4fb605fedde6b33a9058a6b7"} Apr 22 18:49:25.406630 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:25.406590 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hd5dg" event={"ID":"91926044-1ae6-4a59-b70b-8694263f69bc","Type":"ContainerStarted","Data":"da1bd9563981975f396407b8c8e8e5c6d67bceaca64fcdae7ebf9b477ce72517"} Apr 22 18:49:26.410905 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:26.410877 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hd5dg" event={"ID":"91926044-1ae6-4a59-b70b-8694263f69bc","Type":"ContainerStarted","Data":"e157589b165ae9359426f2c87769fd92f599cb9d4a7de4ab0e0080349da638c7"} Apr 22 18:49:26.430129 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:26.430085 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hd5dg" podStartSLOduration=1.57367519 podStartE2EDuration="3.430057575s" podCreationTimestamp="2026-04-22 18:49:23 +0000 UTC" firstStartedPulling="2026-04-22 18:49:24.311558237 +0000 UTC m=+168.034317770" lastFinishedPulling="2026-04-22 18:49:26.167940621 +0000 UTC m=+169.890700155" observedRunningTime="2026-04-22 18:49:26.429366309 +0000 UTC m=+170.152125866" watchObservedRunningTime="2026-04-22 18:49:26.430057575 +0000 UTC m=+170.152817131" Apr 22 18:49:29.016554 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:29.016520 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:29.017115 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:29.017095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b44c3680-e1d1-4e14-b58a-8dccd8912f42-service-ca-bundle\") pod \"router-default-578f969574-tk4dr\" (UID: \"b44c3680-e1d1-4e14-b58a-8dccd8912f42\") " pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:29.262808 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:29.262760 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:29.385682 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:29.385655 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-578f969574-tk4dr"] Apr 22 18:49:29.389090 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:29.389063 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44c3680_e1d1_4e14_b58a_8dccd8912f42.slice/crio-d861dd365056748907f213ce66e9f59192a829b22dabec13a573555400f79ec9 WatchSource:0}: Error finding container d861dd365056748907f213ce66e9f59192a829b22dabec13a573555400f79ec9: Status 404 returned error can't find the container with id d861dd365056748907f213ce66e9f59192a829b22dabec13a573555400f79ec9 Apr 22 18:49:29.418997 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:29.418973 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-578f969574-tk4dr" event={"ID":"b44c3680-e1d1-4e14-b58a-8dccd8912f42","Type":"ContainerStarted","Data":"d861dd365056748907f213ce66e9f59192a829b22dabec13a573555400f79ec9"} Apr 22 18:49:30.392875 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:30.392844 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6vxjs" Apr 22 18:49:30.423332 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:30.423294 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-578f969574-tk4dr" event={"ID":"b44c3680-e1d1-4e14-b58a-8dccd8912f42","Type":"ContainerStarted","Data":"5c64b1177ded45f970d3918609f6ee4b107e5690cdfc55ce8f6760ef33a25ee5"} Apr 22 18:49:30.442242 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:30.441749 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-578f969574-tk4dr" podStartSLOduration=33.441731005 podStartE2EDuration="33.441731005s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:30.441363015 +0000 UTC m=+174.164122573" watchObservedRunningTime="2026-04-22 18:49:30.441731005 +0000 UTC m=+174.164490561" Apr 22 18:49:31.263290 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:31.263251 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:31.265733 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:31.265705 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:31.425854 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:31.425822 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 18:49:31.427037 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:31.427018 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-578f969574-tk4dr" Apr 22 
18:49:38.705532 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.705496 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l"] Apr 22 18:49:38.709963 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.709946 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.712555 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.712527 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:49:38.712799 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.712774 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:38.713868 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.713847 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:49:38.714119 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.713923 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:49:38.714119 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.713943 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2597p\"" Apr 22 18:49:38.714119 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.713928 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:49:38.720398 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.720378 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l"] Apr 22 18:49:38.731838 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.731817 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-k4rql"] Apr 22 18:49:38.734846 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.734828 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.735176 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.735156 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4htns"] Apr 22 18:49:38.737589 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.737566 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 18:49:38.737833 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.737817 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:49:38.737962 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.737948 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 18:49:38.738155 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.738141 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-pq6s5\"" Apr 22 18:49:38.738503 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.738487 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.740983 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.740964 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mrmnz\"" Apr 22 18:49:38.741080 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.740964 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:38.741477 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.741448 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:38.741573 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.741479 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:38.748179 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.748150 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-k4rql"] Apr 22 18:49:38.786371 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgs6d\" (UniqueName: \"kubernetes.io/projected/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-api-access-lgs6d\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.786472 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.786472 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786397 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.786472 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786429 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.786472 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786462 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c84610a8-5b8b-417a-9fd8-90e002f5b413-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.786626 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786493 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-root\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786626 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786515 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-wtmp\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786626 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786539 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.786626 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786564 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c84610a8-5b8b-417a-9fd8-90e002f5b413-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.786626 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786579 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-tls\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " 
pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786626 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786594 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdg4\" (UniqueName: \"kubernetes.io/projected/c87142b9-f4d6-412c-8056-4ca82a20f8a2-kube-api-access-vwdg4\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786625 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-textfile\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786662 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c87142b9-f4d6-412c-8056-4ca82a20f8a2-metrics-client-ca\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786690 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c84610a8-5b8b-417a-9fd8-90e002f5b413-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrmd\" (UniqueName: \"kubernetes.io/projected/c84610a8-5b8b-417a-9fd8-90e002f5b413-kube-api-access-zvrmd\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786737 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786774 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786814 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.786900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.786845 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-sys\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887177 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887152 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrmd\" (UniqueName: \"kubernetes.io/projected/c84610a8-5b8b-417a-9fd8-90e002f5b413-kube-api-access-zvrmd\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.887302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887184 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887222 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887248 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-sys\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887278 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgs6d\" (UniqueName: \"kubernetes.io/projected/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-api-access-lgs6d\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887306 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887335 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887374 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887380 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-sys\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887394 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c84610a8-5b8b-417a-9fd8-90e002f5b413-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887412 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-root\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887462 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-root\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887470 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-wtmp\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887558 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887518 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887572 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c84610a8-5b8b-417a-9fd8-90e002f5b413-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887600 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-tls\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887626 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdg4\" (UniqueName: \"kubernetes.io/projected/c87142b9-f4d6-412c-8056-4ca82a20f8a2-kube-api-access-vwdg4\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887653 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-textfile\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887704 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c87142b9-f4d6-412c-8056-4ca82a20f8a2-metrics-client-ca\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887743 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c84610a8-5b8b-417a-9fd8-90e002f5b413-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887743 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.887960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.887761 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-wtmp\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.887960 ip-10-0-129-249 
kubenswrapper[2581]: E0422 18:49:38.887867 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:49:38.888414 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:49:38.888106 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-tls podName:c87142b9-f4d6-412c-8056-4ca82a20f8a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:39.38807158 +0000 UTC m=+183.110831115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-tls") pod "node-exporter-4htns" (UID: "c87142b9-f4d6-412c-8056-4ca82a20f8a2") : secret "node-exporter-tls" not found
Apr 22 18:49:38.888414 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.888164 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql"
Apr 22 18:49:38.888414 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.888204 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-accelerators-collector-config\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns"
Apr 22 18:49:38.888588 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.888558 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c87142b9-f4d6-412c-8056-4ca82a20f8a2-metrics-client-ca\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns"
Apr 22 18:49:38.888961 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.888906 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c84610a8-5b8b-417a-9fd8-90e002f5b413-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l"
Apr 22 18:49:38.889076 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.889009 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql"
Apr 22 18:49:38.889216 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.889189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-textfile\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns"
Apr 22 18:49:38.890951 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.890926 2581
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.891047 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.890981 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.891047 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.890976 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c84610a8-5b8b-417a-9fd8-90e002f5b413-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.891047 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.891040 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c84610a8-5b8b-417a-9fd8-90e002f5b413-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:38.891218 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.891177 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.897517 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.897492 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgs6d\" (UniqueName: \"kubernetes.io/projected/c1c77ec1-bb46-4c4a-8fa4-efe3f3206330-kube-api-access-lgs6d\") pod \"kube-state-metrics-69db897b98-k4rql\" (UID: \"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:38.897628 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.897609 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdg4\" (UniqueName: \"kubernetes.io/projected/c87142b9-f4d6-412c-8056-4ca82a20f8a2-kube-api-access-vwdg4\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:38.897909 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:38.897888 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrmd\" (UniqueName: \"kubernetes.io/projected/c84610a8-5b8b-417a-9fd8-90e002f5b413-kube-api-access-zvrmd\") pod \"openshift-state-metrics-9d44df66c-kqx8l\" (UID: \"c84610a8-5b8b-417a-9fd8-90e002f5b413\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:39.018870 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:49:39.018809 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" Apr 22 18:49:39.045506 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.045486 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" Apr 22 18:49:39.148816 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.148684 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l"] Apr 22 18:49:39.152185 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:39.152153 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc84610a8_5b8b_417a_9fd8_90e002f5b413.slice/crio-d7882664eabf753c3e77068182177b907d5d15c107da13f62f450cdf17761108 WatchSource:0}: Error finding container d7882664eabf753c3e77068182177b907d5d15c107da13f62f450cdf17761108: Status 404 returned error can't find the container with id d7882664eabf753c3e77068182177b907d5d15c107da13f62f450cdf17761108 Apr 22 18:49:39.177675 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.177652 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-k4rql"] Apr 22 18:49:39.181864 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:39.181839 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c77ec1_bb46_4c4a_8fa4_efe3f3206330.slice/crio-88dfb195b7d8ea29e656dc7ffb0c6dc8a47be54b8e139d5ec349c39921e96595 WatchSource:0}: Error finding container 88dfb195b7d8ea29e656dc7ffb0c6dc8a47be54b8e139d5ec349c39921e96595: Status 404 returned error can't find the container with id 88dfb195b7d8ea29e656dc7ffb0c6dc8a47be54b8e139d5ec349c39921e96595 Apr 22 18:49:39.391401 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.391372 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-tls\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:39.393718 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.393697 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c87142b9-f4d6-412c-8056-4ca82a20f8a2-node-exporter-tls\") pod \"node-exporter-4htns\" (UID: \"c87142b9-f4d6-412c-8056-4ca82a20f8a2\") " pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:39.446943 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.446907 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" event={"ID":"c84610a8-5b8b-417a-9fd8-90e002f5b413","Type":"ContainerStarted","Data":"ab9e85bf6a9ce4402ad0a9f41352a77817cbb3dec1d617567eabd864d06c2216"} Apr 22 18:49:39.447041 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.446952 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" event={"ID":"c84610a8-5b8b-417a-9fd8-90e002f5b413","Type":"ContainerStarted","Data":"24bee79273c105e077ae392cdcd129e57faf1a410db7f5536c9de7c4ddd32c07"} Apr 22 18:49:39.447041 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.446963 2581 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" event={"ID":"c84610a8-5b8b-417a-9fd8-90e002f5b413","Type":"ContainerStarted","Data":"d7882664eabf753c3e77068182177b907d5d15c107da13f62f450cdf17761108"} Apr 22 18:49:39.447875 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.447853 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" event={"ID":"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330","Type":"ContainerStarted","Data":"88dfb195b7d8ea29e656dc7ffb0c6dc8a47be54b8e139d5ec349c39921e96595"} Apr 22 18:49:39.650226 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:39.650153 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4htns" Apr 22 18:49:39.659365 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:39.659333 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87142b9_f4d6_412c_8056_4ca82a20f8a2.slice/crio-fb381ebcd0babd5b2102974286966c07a6217a83d44c226f8f876f70e7c9eab7 WatchSource:0}: Error finding container fb381ebcd0babd5b2102974286966c07a6217a83d44c226f8f876f70e7c9eab7: Status 404 returned error can't find the container with id fb381ebcd0babd5b2102974286966c07a6217a83d44c226f8f876f70e7c9eab7 Apr 22 18:49:40.452137 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:40.452103 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4htns" event={"ID":"c87142b9-f4d6-412c-8056-4ca82a20f8a2","Type":"ContainerStarted","Data":"fb381ebcd0babd5b2102974286966c07a6217a83d44c226f8f876f70e7c9eab7"} Apr 22 18:49:41.455969 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.455928 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" event={"ID":"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330","Type":"ContainerStarted","Data":"f29e9919c72f4b74c2f45c770f94d555ffec60eee345cb23fbda1b887ee4c968"} Apr 22 18:49:41.456360 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.455972 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" event={"ID":"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330","Type":"ContainerStarted","Data":"75b3b30e41cc99b49d92cd48a432ec9b3dd95a9a2c858f2ed1127f5812ffd5f8"} Apr 22 18:49:41.456360 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.455986 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" event={"ID":"c1c77ec1-bb46-4c4a-8fa4-efe3f3206330","Type":"ContainerStarted","Data":"d60c431f58d6b45375dadf931dfc26265ca1f8c79e2bafcac8b6c2ab1cb3facd"} Apr 22 18:49:41.457512 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.457487 2581 generic.go:358] "Generic (PLEG): container finished" podID="c87142b9-f4d6-412c-8056-4ca82a20f8a2" containerID="7319a1e8c9c6b678759bb64c4a814f05377ea49e302de32cf26e48291e24b0f8" exitCode=0 Apr 22 18:49:41.457624 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.457574 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4htns" event={"ID":"c87142b9-f4d6-412c-8056-4ca82a20f8a2","Type":"ContainerDied","Data":"7319a1e8c9c6b678759bb64c4a814f05377ea49e302de32cf26e48291e24b0f8"} Apr 22 18:49:41.459464 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.459443 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" event={"ID":"c84610a8-5b8b-417a-9fd8-90e002f5b413","Type":"ContainerStarted","Data":"ccc32188971e9d38a543648c5e702a1405e2610f8f9bee3344a8dc3870381508"} Apr 22 18:49:41.475892 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.475852 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-k4rql" podStartSLOduration=2.224256006 podStartE2EDuration="3.475841871s" podCreationTimestamp="2026-04-22 18:49:38 +0000 UTC" firstStartedPulling="2026-04-22 18:49:39.183607341 +0000 UTC m=+182.906366875" lastFinishedPulling="2026-04-22 18:49:40.435193201 +0000 UTC m=+184.157952740" observedRunningTime="2026-04-22 18:49:41.473855975 +0000 UTC m=+185.196615542" watchObservedRunningTime="2026-04-22 18:49:41.475841871 +0000 UTC m=+185.198601426" Apr 22 18:49:41.517489 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.517440 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kqx8l" podStartSLOduration=2.359059068 podStartE2EDuration="3.51742415s" podCreationTimestamp="2026-04-22 18:49:38 +0000 UTC" firstStartedPulling="2026-04-22 18:49:39.278576771 +0000 UTC m=+183.001336310" lastFinishedPulling="2026-04-22 18:49:40.436941843 +0000 UTC m=+184.159701392" observedRunningTime="2026-04-22 18:49:41.516828746 +0000 UTC m=+185.239588301" watchObservedRunningTime="2026-04-22 18:49:41.51742415 +0000 UTC m=+185.240183709" Apr 22 18:49:41.741104 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.741028 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-9b48d7d7-5pr8b"] Apr 22 18:49:41.744666 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.744645 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.747445 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747406 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 18:49:41.747538 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747406 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 18:49:41.747538 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747458 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2u2c7i0fuer4a\"" Apr 22 18:49:41.747647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747540 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 18:49:41.747647 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747412 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 18:49:41.747866 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747850 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 18:49:41.747912 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.747885 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-c77ff\"" Apr 22 18:49:41.755004 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.754983 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9b48d7d7-5pr8b"] Apr 22 18:49:41.811436 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811413 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9fpb\" (UniqueName: \"kubernetes.io/projected/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-kube-api-access-k9fpb\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811527 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811461 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-grpc-tls\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811527 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811489 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-metrics-client-ca\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811527 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811517 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-web\") pod 
\"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811637 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811681 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811637 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811681 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811661 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-tls\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.811681 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.811677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.912715 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.912693 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.912843 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.912727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-tls\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.912843 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.912750 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.912994 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.912912 
2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9fpb\" (UniqueName: \"kubernetes.io/projected/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-kube-api-access-k9fpb\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.912994 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.912968 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-grpc-tls\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.913091 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.913007 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-metrics-client-ca\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.913091 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.913037 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.913197 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.913109 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.914162 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.914028 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-metrics-client-ca\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.920118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.916305 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.920118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.916314 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-grpc-tls\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.920118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.916363 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.920118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.916423 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.920118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.916614 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-tls\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.921831 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.921742 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:41.922846 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:41.922827 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9fpb\" (UniqueName: \"kubernetes.io/projected/296c7cc3-d484-4a0d-86fa-1239c66bd7b1-kube-api-access-k9fpb\") pod \"thanos-querier-9b48d7d7-5pr8b\" (UID: \"296c7cc3-d484-4a0d-86fa-1239c66bd7b1\") " pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:42.054348 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:42.054284 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:42.180801 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:42.180762 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9b48d7d7-5pr8b"] Apr 22 18:49:42.185511 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:42.185473 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod296c7cc3_d484_4a0d_86fa_1239c66bd7b1.slice/crio-41cc64736a4831b3f80bf4f388772d217ca065cd41f6881e64c2340560dffa60 WatchSource:0}: Error finding container 41cc64736a4831b3f80bf4f388772d217ca065cd41f6881e64c2340560dffa60: Status 404 returned error can't find the container with id 41cc64736a4831b3f80bf4f388772d217ca065cd41f6881e64c2340560dffa60 Apr 22 18:49:42.462975 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:42.462938 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"41cc64736a4831b3f80bf4f388772d217ca065cd41f6881e64c2340560dffa60"} Apr 22 18:49:42.464822 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:42.464775 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4htns" event={"ID":"c87142b9-f4d6-412c-8056-4ca82a20f8a2","Type":"ContainerStarted","Data":"42f9205be3105f5b2b3c34c5764fed1071c1a1ec0d99b4ebab196ec6ee719ba4"} Apr 22 18:49:42.464928 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:42.464825 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4htns" event={"ID":"c87142b9-f4d6-412c-8056-4ca82a20f8a2","Type":"ContainerStarted","Data":"a0aa6881c9bd6647db880b8210faa703b798420bbb8d35aa0ce7d2f9e4ee27f8"} Apr 22 18:49:42.485486 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:42.485441 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4htns" podStartSLOduration=3.27582738 podStartE2EDuration="4.485429029s" podCreationTimestamp="2026-04-22 18:49:38 +0000 UTC" firstStartedPulling="2026-04-22 18:49:39.6616066 +0000 UTC m=+183.384366135" lastFinishedPulling="2026-04-22 18:49:40.871208249 +0000 UTC m=+184.593967784" observedRunningTime="2026-04-22 18:49:42.484645186 +0000 UTC m=+186.207404755" watchObservedRunningTime="2026-04-22 18:49:42.485429029 +0000 UTC m=+186.208188584" Apr 22 18:49:43.908171 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.908143 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-679d89dd4b-549qs"] Apr 22 18:49:43.911643 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.911622 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.914326 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.914305 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:49:43.914428 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.914322 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:49:43.914428 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.914342 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:49:43.914428 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.914321 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-njp98\"" Apr 22 18:49:43.914632 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.914618 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:49:43.914713 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.914693 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:49:43.920509 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.920487 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:49:43.923013 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.922994 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-679d89dd4b-549qs"] Apr 22 18:49:43.931525 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931505 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-federate-client-tls\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931525 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931534 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-serving-certs-ca-bundle\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931554 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-metrics-client-ca\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931586 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk548\" (UniqueName: \"kubernetes.io/projected/e41550af-02c2-48fe-ac0a-774a42ba00e0-kube-api-access-dk548\") pod 
\"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931644 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-secret-telemeter-client\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931728 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931768 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-telemeter-client-tls\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:43.931880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:43.931818 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032630 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032766 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032670 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-telemeter-client-tls\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032766 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032699 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032766 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032757 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-federate-client-tls\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032981 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032805 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-serving-certs-ca-bundle\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032981 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032833 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-metrics-client-ca\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032981 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032865 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk548\" (UniqueName: \"kubernetes.io/projected/e41550af-02c2-48fe-ac0a-774a42ba00e0-kube-api-access-dk548\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.032981 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.032891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-secret-telemeter-client\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.033709 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.033633 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.033835 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.033744 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-serving-certs-ca-bundle\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.034149 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.034099 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e41550af-02c2-48fe-ac0a-774a42ba00e0-metrics-client-ca\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.035640 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.035615 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-telemeter-client-tls\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.036248 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.036216 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-federate-client-tls\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.036248 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.036239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.037066 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.037044 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e41550af-02c2-48fe-ac0a-774a42ba00e0-secret-telemeter-client\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.041966 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.041947 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk548\" (UniqueName: \"kubernetes.io/projected/e41550af-02c2-48fe-ac0a-774a42ba00e0-kube-api-access-dk548\") pod \"telemeter-client-679d89dd4b-549qs\" (UID: \"e41550af-02c2-48fe-ac0a-774a42ba00e0\") " pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.222178 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.222147 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" Apr 22 18:49:44.342481 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.342449 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-679d89dd4b-549qs"] Apr 22 18:49:44.345389 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:44.345358 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode41550af_02c2_48fe_ac0a_774a42ba00e0.slice/crio-fcf46520d4ff7ad944eba405aeba2f3f077194c6018af8a49776040fb51d87b6 WatchSource:0}: Error finding container fcf46520d4ff7ad944eba405aeba2f3f077194c6018af8a49776040fb51d87b6: Status 404 returned error can't find the container with id fcf46520d4ff7ad944eba405aeba2f3f077194c6018af8a49776040fb51d87b6 Apr 22 18:49:44.476900 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.476820 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" event={"ID":"e41550af-02c2-48fe-ac0a-774a42ba00e0","Type":"ContainerStarted","Data":"fcf46520d4ff7ad944eba405aeba2f3f077194c6018af8a49776040fb51d87b6"} Apr 22 18:49:44.478891 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.478855 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"3e3e70882e7f364098ad6bc08cfe1ae887c2c25bc5d87323958fc63316566005"} Apr 22 18:49:44.478891 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.478891 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"13c5e6c731568e86049ef6e1592acc0ac555258c2700341e4beef76e99af2201"} Apr 22 18:49:44.479028 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:44.478902 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"bfd5b773f581b097d4ff2108cccbc776bee0097803804b0783e807ae4c35e8fe"} Apr 22 18:49:45.034255 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.034175 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:45.038556 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.038528 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.041951 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.041926 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qljfn\"" Apr 22 18:49:45.042346 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.042319 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:49:45.042441 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.042397 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:49:45.042441 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.042414 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:49:45.042545 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.042319 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:49:45.043207 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043186 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:49:45.043283 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043208 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:49:45.043283 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043232 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:49:45.043283 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043265 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:49:45.043283 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043265 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:49:45.043494 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043348 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:49:45.044098 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043678 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6tqctot76vrth\"" Apr 22 18:49:45.044098 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.043703 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:49:45.045854 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.045832 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:49:45.050365 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.050343 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:49:45.056962 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.056940 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:45.142130 
ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142099 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142253 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142139 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-config\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142253 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142168 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcqp\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-kube-api-access-xhcqp\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142253 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142201 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142412 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142265 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-web-config\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142412 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142306 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142412 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142334 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142412 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142364 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142563 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142412 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142563 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142446 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142563 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142501 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142563 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142538 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142752 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142573 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-config-out\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142752 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142638 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142752 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142679 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142752 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142704 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142752 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142740 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.142976 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.142817 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.243935 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.243899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244077 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.243956 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244145 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244127 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-config-out\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244208 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244164 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244208 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244311 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244222 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244311 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244259 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244311 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244302 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244459 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244345 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244459 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244372 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-config\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244459 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244396 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcqp\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-kube-api-access-xhcqp\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244459 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244430 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244639 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244480 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-web-config\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244639 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244639 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244534 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244639 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244563 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244639 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244639 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244613 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.244940 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.244910 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.246484 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.245481 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.246484 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.246167 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.249595 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.248381 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.249595 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.248696 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.249595 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.248891 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-config-out\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.249595 ip-10-0-129-249 kubenswrapper[2581]: I0422 
18:49:45.249521 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.249595 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.249533 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.250050 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.249979 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.250050 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.250032 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.250204 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.250039 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.250302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.250275 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.250842 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.250821 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.250985 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.250963 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.251920 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.251884 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" 
(UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.252018 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.251927 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-config\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.252282 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.252262 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-web-config\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.253715 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.253693 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcqp\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-kube-api-access-xhcqp\") pod \"prometheus-k8s-0\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.351640 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.351577 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:45.487079 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.487037 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"6bbdb6b0c09a7ae5050a04ea1a5e5de5c1bfc9456f49dff544ebc8a7ee114391"} Apr 22 18:49:45.487079 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.487080 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"1c950cce7f9255b9765205512532bc25a803f34ace11a291971d2775b163c9b4"} Apr 22 18:49:45.487285 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.487092 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" event={"ID":"296c7cc3-d484-4a0d-86fa-1239c66bd7b1","Type":"ContainerStarted","Data":"6356d52283077dea53d06e7def2055a86e807c7d1d24841af19f082a4a153ed5"} Apr 22 18:49:45.487285 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.487217 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:45.510574 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.510525 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" podStartSLOduration=1.94826111 podStartE2EDuration="4.510512247s" podCreationTimestamp="2026-04-22 18:49:41 +0000 UTC" firstStartedPulling="2026-04-22 18:49:42.186991117 +0000 UTC m=+185.909750650" lastFinishedPulling="2026-04-22 18:49:44.749242251 +0000 UTC m=+188.472001787" observedRunningTime="2026-04-22 18:49:45.510317744 +0000 UTC m=+189.233077322" watchObservedRunningTime="2026-04-22 18:49:45.510512247 +0000 UTC m=+189.233271804" Apr 22 18:49:45.888722 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:45.888693 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:45.890914 
ip-10-0-129-249 kubenswrapper[2581]: W0422 18:49:45.890889 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod536f95f2_b47e_4288_9240_835bbefc9475.slice/crio-d478ca0d4657156579d3a7dcfd3af92529f46c72084e2bc002971964698ceded WatchSource:0}: Error finding container d478ca0d4657156579d3a7dcfd3af92529f46c72084e2bc002971964698ceded: Status 404 returned error can't find the container with id d478ca0d4657156579d3a7dcfd3af92529f46c72084e2bc002971964698ceded Apr 22 18:49:46.492152 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:46.492115 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" event={"ID":"e41550af-02c2-48fe-ac0a-774a42ba00e0","Type":"ContainerStarted","Data":"5bd2c61389c7c92ddb9ee1e32c57bcf40864b08c38e60b346f8d777aa3ed86e9"} Apr 22 18:49:46.493932 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:46.493900 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"d478ca0d4657156579d3a7dcfd3af92529f46c72084e2bc002971964698ceded"} Apr 22 18:49:47.498259 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:47.498217 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" event={"ID":"e41550af-02c2-48fe-ac0a-774a42ba00e0","Type":"ContainerStarted","Data":"9ca88735aaa8bc1159e53218c846839e55a35fb2df2a94b3e598f362676dadbd"} Apr 22 18:49:47.498259 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:47.498264 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" event={"ID":"e41550af-02c2-48fe-ac0a-774a42ba00e0","Type":"ContainerStarted","Data":"54ee2116b7da43dae2d40e12ede1bc281bf3a27fb4c9301395bf48e493ca6bf7"} Apr 22 18:49:47.499561 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:47.499542 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" exitCode=0 Apr 22 18:49:47.499620 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:47.499574 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} Apr 22 18:49:47.522372 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:47.522329 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-679d89dd4b-549qs" podStartSLOduration=1.9942565129999998 podStartE2EDuration="4.522318404s" podCreationTimestamp="2026-04-22 18:49:43 +0000 UTC" firstStartedPulling="2026-04-22 18:49:44.347294048 +0000 UTC m=+188.070053582" lastFinishedPulling="2026-04-22 18:49:46.875355925 +0000 UTC m=+190.598115473" observedRunningTime="2026-04-22 18:49:47.521592035 +0000 UTC m=+191.244351596" watchObservedRunningTime="2026-04-22 18:49:47.522318404 +0000 UTC m=+191.245077960" Apr 22 18:49:50.512123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.512043 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} Apr 22 18:49:50.512123 
ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.512078 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} Apr 22 18:49:50.512123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.512091 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} Apr 22 18:49:50.512123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.512099 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} Apr 22 18:49:50.512123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.512107 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} Apr 22 18:49:50.512123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.512115 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerStarted","Data":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} Apr 22 18:49:50.541482 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:50.541431 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.5057990289999998 podStartE2EDuration="5.541413825s" podCreationTimestamp="2026-04-22 18:49:45 +0000 UTC" firstStartedPulling="2026-04-22 18:49:45.892668775 +0000 UTC m=+189.615428309" lastFinishedPulling="2026-04-22 18:49:49.92828357 +0000 UTC m=+193.651043105" observedRunningTime="2026-04-22 18:49:50.539284243 +0000 UTC m=+194.262043828" watchObservedRunningTime="2026-04-22 18:49:50.541413825 +0000 UTC m=+194.264173382" Apr 22 18:49:51.499675 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:51.499649 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-9b48d7d7-5pr8b" Apr 22 18:49:55.352336 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:49:55.352295 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:06.561453 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:06.561414 2581 generic.go:358] "Generic (PLEG): container finished" podID="705539a2-839e-49d6-b593-4edbd2dce2aa" containerID="41836f3bc34605fcee8b56bb71b734e8740db069df8f3c941d88061d023ff4f7" exitCode=0 Apr 22 18:50:06.561826 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:06.561489 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" event={"ID":"705539a2-839e-49d6-b593-4edbd2dce2aa","Type":"ContainerDied","Data":"41836f3bc34605fcee8b56bb71b734e8740db069df8f3c941d88061d023ff4f7"} Apr 22 18:50:06.561880 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:06.561829 2581 scope.go:117] "RemoveContainer" containerID="41836f3bc34605fcee8b56bb71b734e8740db069df8f3c941d88061d023ff4f7" Apr 22 18:50:07.428669 
ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:07.428644 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-578f969574-tk4dr_b44c3680-e1d1-4e14-b58a-8dccd8912f42/router/0.log" Apr 22 18:50:07.448382 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:07.448354 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qbsd9_a5c3671b-f180-45d3-aad6-34b06441fbac/serve-healthcheck-canary/0.log" Apr 22 18:50:07.565250 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:07.565219 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qkfrb" event={"ID":"705539a2-839e-49d6-b593-4edbd2dce2aa","Type":"ContainerStarted","Data":"ada399ea6845a4c665909c965a9502b57cd34edd1b894051c11b7f772dbe3d3b"} Apr 22 18:50:26.626075 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:26.626041 2581 generic.go:358] "Generic (PLEG): container finished" podID="73aa7185-77ed-4fd2-ae5a-96192fefe723" containerID="60cc49653897e78552f0d46a1aebd59126f740a0f64950024da1962e0c3e839d" exitCode=0 Apr 22 18:50:26.626445 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:26.626121 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" event={"ID":"73aa7185-77ed-4fd2-ae5a-96192fefe723","Type":"ContainerDied","Data":"60cc49653897e78552f0d46a1aebd59126f740a0f64950024da1962e0c3e839d"} Apr 22 18:50:26.626445 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:26.626393 2581 scope.go:117] "RemoveContainer" containerID="60cc49653897e78552f0d46a1aebd59126f740a0f64950024da1962e0c3e839d" Apr 22 18:50:27.630557 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:27.630524 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-xgv2k" event={"ID":"73aa7185-77ed-4fd2-ae5a-96192fefe723","Type":"ContainerStarted","Data":"5f72dcba28b8ac2570a74764f3defd9f0367fe8d4256888aaf86d0d4d8c5c171"} Apr 22 18:50:45.352291 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:45.352253 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:45.370072 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:45.370044 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:45.710888 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:45.710862 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:47.518673 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:47.518619 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:50:47.521557 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:47.521516 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47eec246-c244-4918-8600-48de7568588b-metrics-certs\") pod \"network-metrics-daemon-z9kwg\" (UID: \"47eec246-c244-4918-8600-48de7568588b\") " pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:50:47.562519 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:47.562495 2581 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qpsz\"" Apr 22 18:50:47.570213 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:47.570194 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9kwg" Apr 22 18:50:47.686598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:47.686494 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z9kwg"] Apr 22 18:50:47.690369 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:50:47.690340 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47eec246_c244_4918_8600_48de7568588b.slice/crio-ad4f233a4bdc2c74f23a9859cf36a435daf694407098924119aaf15e33710ce3 WatchSource:0}: Error finding container ad4f233a4bdc2c74f23a9859cf36a435daf694407098924119aaf15e33710ce3: Status 404 returned error can't find the container with id ad4f233a4bdc2c74f23a9859cf36a435daf694407098924119aaf15e33710ce3 Apr 22 18:50:47.701351 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:47.701309 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9kwg" event={"ID":"47eec246-c244-4918-8600-48de7568588b","Type":"ContainerStarted","Data":"ad4f233a4bdc2c74f23a9859cf36a435daf694407098924119aaf15e33710ce3"} Apr 22 18:50:49.710129 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:49.710092 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9kwg" event={"ID":"47eec246-c244-4918-8600-48de7568588b","Type":"ContainerStarted","Data":"8df6e268339c4317dac0a37cfed48d92bd0da7fde7cc3a36764c1b980986496e"} Apr 22 18:50:49.710129 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:49.710130 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9kwg" event={"ID":"47eec246-c244-4918-8600-48de7568588b","Type":"ContainerStarted","Data":"d956a2c8ae9dfc43c0670d504427de9415a9986e22f6f133d3d7c682247ab045"} Apr 22 18:50:49.728128 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:50:49.728070 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z9kwg" podStartSLOduration=252.779923116 podStartE2EDuration="4m13.72805487s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:50:47.692212111 +0000 UTC m=+251.414971646" lastFinishedPulling="2026-04-22 18:50:48.640343864 +0000 UTC m=+252.363103400" observedRunningTime="2026-04-22 18:50:49.72648556 +0000 UTC m=+253.449245118" watchObservedRunningTime="2026-04-22 18:50:49.72805487 +0000 UTC m=+253.450814426" Apr 22 18:51:03.360426 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360350 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:51:03.361105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360859 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="prometheus" containerID="cri-o://29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" gracePeriod=600 Apr 22 18:51:03.361105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360894 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="536f95f2-b47e-4288-9240-835bbefc9475" 
containerName="thanos-sidecar" containerID="cri-o://43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" gracePeriod=600 Apr 22 18:51:03.361105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360950 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-web" containerID="cri-o://9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" gracePeriod=600 Apr 22 18:51:03.361105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360894 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy" containerID="cri-o://54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" gracePeriod=600 Apr 22 18:51:03.361105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360900 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-thanos" containerID="cri-o://c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" gracePeriod=600 Apr 22 18:51:03.361105 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.360902 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="config-reloader" containerID="cri-o://ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" gracePeriod=600 Apr 22 18:51:03.606968 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.606936 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.624344 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624287 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-tls-assets\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624344 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624318 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-rulefiles-0\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624495 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624347 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-thanos-prometheus-http-client-file\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624495 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624375 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-db\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624495 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624404 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-metrics-client-certs\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624495 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624442 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-tls\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624495 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624468 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-metrics-client-ca\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624503 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-serving-certs-ca-bundle\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624544 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-config\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: 
\"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624580 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-kube-rbac-proxy\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624614 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624651 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-grpc-tls\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624678 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhcqp\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-kube-api-access-xhcqp\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624703 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-web-config\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.624744 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624735 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-trusted-ca-bundle\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.625187 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624769 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-kubelet-serving-ca-bundle\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.625187 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624832 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-config-out\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: \"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.625187 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.624860 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"536f95f2-b47e-4288-9240-835bbefc9475\" (UID: 
\"536f95f2-b47e-4288-9240-835bbefc9475\") " Apr 22 18:51:03.625715 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.625683 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:03.625885 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.625855 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:03.625979 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.625957 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:03.627270 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.626915 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:03.627369 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.627270 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.627426 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.627365 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:03.627478 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.627412 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:03.627976 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.627948 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.628066 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.628033 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:03.628663 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.628634 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-kube-api-access-xhcqp" (OuterVolumeSpecName: "kube-api-access-xhcqp") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "kube-api-access-xhcqp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:03.628922 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.628898 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.628995 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.628920 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.629255 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.629217 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.629442 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.629422 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.629713 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.629696 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-config-out" (OuterVolumeSpecName: "config-out") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:03.630017 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.630000 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-config" (OuterVolumeSpecName: "config") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.630554 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.630537 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.642200 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.642171 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-web-config" (OuterVolumeSpecName: "web-config") pod "536f95f2-b47e-4288-9240-835bbefc9475" (UID: "536f95f2-b47e-4288-9240-835bbefc9475"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:03.725985 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.725959 2581 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.725985 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.725984 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-db\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.725995 2581 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-metrics-client-certs\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726004 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726012 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-metrics-client-ca\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726021 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726030 2581 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-config\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726039 2581 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-kube-rbac-proxy\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726048 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726057 2581 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-grpc-tls\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726065 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xhcqp\" (UniqueName: 
\"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-kube-api-access-xhcqp\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726074 2581 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-web-config\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726082 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726090 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726097 2581 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/536f95f2-b47e-4288-9240-835bbefc9475-config-out\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726106 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/536f95f2-b47e-4288-9240-835bbefc9475-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726114 2581 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/536f95f2-b47e-4288-9240-835bbefc9475-tls-assets\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.726161 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.726121 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/536f95f2-b47e-4288-9240-835bbefc9475-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:51:03.757377 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757348 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" exitCode=0 Apr 22 18:51:03.757377 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757371 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" exitCode=0 Apr 22 18:51:03.757377 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757379 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" exitCode=0 Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757386 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" exitCode=0 Apr 22 
18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757392 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" exitCode=0 Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757397 2581 generic.go:358] "Generic (PLEG): container finished" podID="536f95f2-b47e-4288-9240-835bbefc9475" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" exitCode=0 Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757428 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757456 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757470 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757483 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757492 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757501 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757509 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757518 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"536f95f2-b47e-4288-9240-835bbefc9475","Type":"ContainerDied","Data":"d478ca0d4657156579d3a7dcfd3af92529f46c72084e2bc002971964698ceded"} Apr 22 18:51:03.757540 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.757537 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.766275 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.766257 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.794396 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.774740 2581 
scope.go:117] "RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.796332 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.796305 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:51:03.797426 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.797405 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:51:03.798462 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.798446 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.805435 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.805417 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.812342 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.812324 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.812997 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.812976 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:51:03.813345 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813330 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="config-reloader" Apr 22 18:51:03.813400 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813371 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="config-reloader" Apr 22 18:51:03.813400 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813382 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-web" Apr 22 18:51:03.813400 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813387 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-web" Apr 22 18:51:03.813400 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813397 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="thanos-sidecar" Apr 22 18:51:03.813400 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813403 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="thanos-sidecar" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813438 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="prometheus" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813445 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="prometheus" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813458 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="init-config-reloader" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813465 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="init-config-reloader" Apr 22 18:51:03.813656 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:51:03.813477 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813509 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813526 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-thanos" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813534 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-thanos" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813637 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="thanos-sidecar" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813653 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="prometheus" Apr 22 18:51:03.813656 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813663 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-web" Apr 22 18:51:03.814121 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813677 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy" Apr 22 18:51:03.814121 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813688 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="kube-rbac-proxy-thanos" Apr 22 18:51:03.814121 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.813700 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="536f95f2-b47e-4288-9240-835bbefc9475" containerName="config-reloader" Apr 22 18:51:03.818933 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.818910 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.819680 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.819659 2581 scope.go:117] "RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.821925 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.821890 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:51:03.821925 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.821907 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:51:03.822084 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.821951 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:51:03.822084 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.821969 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:51:03.822084 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.821909 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:51:03.822233 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822216 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:51:03.822310 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822291 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:51:03.822436 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822420 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:51:03.822519 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822427 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qljfn\"" Apr 22 18:51:03.822519 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822512 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:51:03.822606 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822538 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:51:03.822606 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822582 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:51:03.822894 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.822875 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6tqctot76vrth\"" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827224 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827273 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827313 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827360 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827440 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827479 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-web-config\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827508 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7vr\" (UniqueName: \"kubernetes.io/projected/75c5e756-e7cb-4409-915b-9608df65d5d7-kube-api-access-rp7vr\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827542 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c5e756-e7cb-4409-915b-9608df65d5d7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.827598 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827573 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.828244 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827618 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.828244 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827673 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.828244 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827715 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-config\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.828244 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827746 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.828244 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.827804 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.829980 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.829525 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:51:03.830049 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.829970 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.830111 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.830059 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.830111 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.830095 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.830211 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.830136 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c5e756-e7cb-4409-915b-9608df65d5d7-config-out\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.832749 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.832722 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:51:03.835160 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.835131 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:51:03.836066 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836037 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.836331 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.836310 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.836389 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836349 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} err="failed to get container status \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" Apr 22 18:51:03.836389 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836379 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.836602 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.836571 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.836602 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836590 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} err="failed to get container status \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": rpc error: code = NotFound desc = could not find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" Apr 22 18:51:03.836602 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836603 2581 scope.go:117] 
"RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.836777 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.836762 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.836928 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836780 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} err="failed to get container status \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" Apr 22 18:51:03.836928 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.836809 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.837008 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.836984 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.837043 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.837020 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} err="failed to get container status \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" Apr 22 18:51:03.837078 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.837043 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.837259 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.837240 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.837302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.837266 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} err="failed to get container status \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": rpc error: code = NotFound 
desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" Apr 22 18:51:03.837302 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.837285 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.837514 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.837497 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.837550 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.837521 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} err="failed to get container status \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" Apr 22 18:51:03.837550 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.837535 2581 scope.go:117] "RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.838133 ip-10-0-129-249 kubenswrapper[2581]: E0422 18:51:03.837774 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does not exist" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.838185 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838147 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} err="failed to get container status \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does not exist" Apr 22 18:51:03.838185 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838170 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.838392 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838375 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} err="failed to get container status \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with 
c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" Apr 22 18:51:03.838436 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838392 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.838608 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838584 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} err="failed to get container status \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": rpc error: code = NotFound desc = could not find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" Apr 22 18:51:03.838608 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838609 2581 scope.go:117] "RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.838877 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838860 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} err="failed to get container status \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" Apr 22 18:51:03.838922 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.838878 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.839081 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839065 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} err="failed to get container status \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" Apr 22 18:51:03.839120 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839082 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.839310 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839293 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} err="failed to get container status \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": rpc error: code = NotFound desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" Apr 22 18:51:03.839355 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839311 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 
18:51:03.839518 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839502 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} err="failed to get container status \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" Apr 22 18:51:03.839559 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839518 2581 scope.go:117] "RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.839755 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839729 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} err="failed to get container status \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does not exist" Apr 22 18:51:03.839831 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839756 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.840002 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.839986 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} err="failed to get container status \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" Apr 22 18:51:03.840067 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840002 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.840210 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840196 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} err="failed to get container status \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": rpc error: code = NotFound desc = could not find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" Apr 22 18:51:03.840263 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840211 2581 scope.go:117] "RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.840422 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840404 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} err="failed to get container status 
\"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" Apr 22 18:51:03.840422 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840420 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.840621 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840607 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} err="failed to get container status \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" Apr 22 18:51:03.840621 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840620 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.840848 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840829 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} err="failed to get container status \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": rpc error: code = NotFound desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" Apr 22 18:51:03.840907 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.840848 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.841046 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841027 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} err="failed to get container status \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" Apr 22 18:51:03.841113 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841049 2581 scope.go:117] "RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.841345 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841299 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} err="failed to get container status \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does 
not exist" Apr 22 18:51:03.841345 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841320 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.841533 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841516 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} err="failed to get container status \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" Apr 22 18:51:03.841573 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841534 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.841732 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841718 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} err="failed to get container status \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": rpc error: code = NotFound desc = could not find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" Apr 22 18:51:03.841822 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841732 2581 scope.go:117] "RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.841960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841936 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} err="failed to get container status \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" Apr 22 18:51:03.841960 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.841953 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.842143 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842129 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} err="failed to get container status \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" Apr 22 18:51:03.842143 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842142 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.842313 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842299 2581 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} err="failed to get container status \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": rpc error: code = NotFound desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" Apr 22 18:51:03.842313 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842312 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.842470 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842455 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} err="failed to get container status \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" Apr 22 18:51:03.842509 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842470 2581 scope.go:117] "RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.842667 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842652 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} err="failed to get container status \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does not exist" Apr 22 18:51:03.842667 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842666 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.842855 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842840 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} err="failed to get container status \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" Apr 22 18:51:03.842956 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.842855 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.843086 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843068 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} err="failed to get container status \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": rpc error: code = NotFound desc = could not 
find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" Apr 22 18:51:03.843132 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843087 2581 scope.go:117] "RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.843287 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843272 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} err="failed to get container status \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" Apr 22 18:51:03.843339 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843287 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.843500 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843483 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} err="failed to get container status \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" Apr 22 18:51:03.843541 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843501 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.843702 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843685 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} err="failed to get container status \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": rpc error: code = NotFound desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" Apr 22 18:51:03.843746 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843701 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.843910 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843894 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} err="failed to get container status \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" Apr 22 18:51:03.843958 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.843909 2581 scope.go:117] 
"RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.844106 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844091 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} err="failed to get container status \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does not exist" Apr 22 18:51:03.844148 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844108 2581 scope.go:117] "RemoveContainer" containerID="c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0" Apr 22 18:51:03.844306 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844290 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0"} err="failed to get container status \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": rpc error: code = NotFound desc = could not find container \"c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0\": container with ID starting with c17178cde3344cdda4232f7d1ab2206a92f9a19274f2642e6220e69800edf1b0 not found: ID does not exist" Apr 22 18:51:03.844306 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844306 2581 scope.go:117] "RemoveContainer" containerID="54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb" Apr 22 18:51:03.844496 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844481 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb"} err="failed to get container status \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": rpc error: code = NotFound desc = could not find container \"54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb\": container with ID starting with 54004bdfb5aecaf7dec9d1dcb862292f025cfb5c9c629d07b7a8df86655135fb not found: ID does not exist" Apr 22 18:51:03.844542 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844495 2581 scope.go:117] "RemoveContainer" containerID="9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a" Apr 22 18:51:03.844682 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844665 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a"} err="failed to get container status \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": rpc error: code = NotFound desc = could not find container \"9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a\": container with ID starting with 9d06ac7d4c635af890126940e4b46b46547062cd96ff213d26887925bd361b6a not found: ID does not exist" Apr 22 18:51:03.844722 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844682 2581 scope.go:117] "RemoveContainer" containerID="43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd" Apr 22 18:51:03.844907 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844890 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd"} err="failed to get container status \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": rpc error: code = NotFound desc = could not find container \"43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd\": container with ID starting with 43a959d497c1312a54fbab4af13e2c488b003db7da15061c63043f505048aedd not found: ID does not exist" Apr 22 18:51:03.844955 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.844908 2581 scope.go:117] "RemoveContainer" containerID="ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0" Apr 22 18:51:03.845111 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.845093 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0"} err="failed to get container status \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": rpc error: code = NotFound desc = could not find container \"ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0\": container with ID starting with ae29baa57e4dc5c71a46bc182582e56883c12a47786f6aa8315e5d3fa68c15a0 not found: ID does not exist" Apr 22 18:51:03.845162 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.845111 2581 scope.go:117] "RemoveContainer" containerID="29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e" Apr 22 18:51:03.845307 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.845284 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e"} err="failed to get container status \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": rpc error: code = NotFound desc = could not find container \"29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e\": container with ID starting with 29d8b61b47f24b106e333f8575721f602d43b8c23a4b635a14f52ad87e28e30e not found: ID does not exist" Apr 22 18:51:03.845359 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.845308 2581 scope.go:117] "RemoveContainer" containerID="e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa" Apr 22 18:51:03.845486 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.845471 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa"} err="failed to get container status \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": rpc error: code = NotFound desc = could not find container \"e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa\": container with ID starting with e1e3371907b9eb1e9ae48ac81a2fa54feb6632563cc60f9a1265df19f6d731aa not found: ID does not exist" Apr 22 18:51:03.931092 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931069 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931183 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931097 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931183 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931183 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931183 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931153 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c5e756-e7cb-4409-915b-9608df65d5d7-config-out\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931183 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931213 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931239 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931275 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931322 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-web-config\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931361 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931342 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7vr\" (UniqueName: \"kubernetes.io/projected/75c5e756-e7cb-4409-915b-9608df65d5d7-kube-api-access-rp7vr\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c5e756-e7cb-4409-915b-9608df65d5d7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931400 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931419 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931434 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931462 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-config\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.931710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.931486 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.932045 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.932008 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.932200 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.932176 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.932514 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.932492 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.933510 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.933487 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.934408 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.934188 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.935007 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.934898 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.935007 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.934898 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.935156 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.935060 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75c5e756-e7cb-4409-915b-9608df65d5d7-config-out\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.935156 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.935079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.935599 ip-10-0-129-249 
kubenswrapper[2581]: I0422 18:51:03.935561 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.935599 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.935575 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.936196 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.936168 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75c5e756-e7cb-4409-915b-9608df65d5d7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.936399 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.936382 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75c5e756-e7cb-4409-915b-9608df65d5d7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.936583 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.936557 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-web-config\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.937003 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.936983 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-config\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.937046 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.937002 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.937664 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.937647 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75c5e756-e7cb-4409-915b-9608df65d5d7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:03.941352 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:03.941335 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7vr\" (UniqueName: \"kubernetes.io/projected/75c5e756-e7cb-4409-915b-9608df65d5d7-kube-api-access-rp7vr\") pod \"prometheus-k8s-0\" (UID: \"75c5e756-e7cb-4409-915b-9608df65d5d7\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 22 18:51:04.134822 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:04.134773 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:04.267118 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:04.267091 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:51:04.268867 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:51:04.268840 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c5e756_e7cb_4409_915b_9608df65d5d7.slice/crio-c2a0fe4bea094a8acd12f2abcb9e60c1cb01390398b85ddca41ddabe8aaf1779 WatchSource:0}: Error finding container c2a0fe4bea094a8acd12f2abcb9e60c1cb01390398b85ddca41ddabe8aaf1779: Status 404 returned error can't find the container with id c2a0fe4bea094a8acd12f2abcb9e60c1cb01390398b85ddca41ddabe8aaf1779 Apr 22 18:51:04.762710 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:04.762675 2581 generic.go:358] "Generic (PLEG): container finished" podID="75c5e756-e7cb-4409-915b-9608df65d5d7" containerID="5a91951e77196563c3f33f72e5a40dbd9e95abef00cb5f76167fca3ae1fcb12f" exitCode=0 Apr 22 18:51:04.763123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:04.762752 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerDied","Data":"5a91951e77196563c3f33f72e5a40dbd9e95abef00cb5f76167fca3ae1fcb12f"} Apr 22 18:51:04.763123 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:04.762774 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"c2a0fe4bea094a8acd12f2abcb9e60c1cb01390398b85ddca41ddabe8aaf1779"} Apr 22 18:51:04.863081 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:04.863054 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536f95f2-b47e-4288-9240-835bbefc9475" path="/var/lib/kubelet/pods/536f95f2-b47e-4288-9240-835bbefc9475/volumes" Apr 22 18:51:05.769051 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:05.769018 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"40c73fd0b19d0680f87267364fb60ffad4a63c21ba5947f3de660f381cdc5c9d"} Apr 22 18:51:05.769051 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:05.769054 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"cf82254f086ab94eeb89b4e436624c4bd88d79d19a6d5ac94417969a51d2fc9d"} Apr 22 18:51:05.769454 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:05.769063 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"347066ebbdbcfff07663bbc1e243239f6334b8da0461ffb7d038bb366ff551e6"} Apr 22 18:51:05.769454 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:05.769076 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"ef22a17f80a1a5a7f3b6fc59318bc8b4bcc391a04e1f9102483bd3eb37d8c9ab"} Apr 22 18:51:05.769454 ip-10-0-129-249 kubenswrapper[2581]: I0422 
18:51:05.769084 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"9cbbcdd802ff0267b835840a71c4da78a376d9ca47dcc2d83d4ec4812d45b22a"} Apr 22 18:51:05.769454 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:05.769093 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75c5e756-e7cb-4409-915b-9608df65d5d7","Type":"ContainerStarted","Data":"b1b6ca7f62fbb79501956994cab4a527044cea47d3c4c41e1438349016faf9bf"} Apr 22 18:51:05.802653 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:05.802593 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.802574583 podStartE2EDuration="2.802574583s" podCreationTimestamp="2026-04-22 18:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:05.800125679 +0000 UTC m=+269.522885236" watchObservedRunningTime="2026-04-22 18:51:05.802574583 +0000 UTC m=+269.525334140" Apr 22 18:51:09.135806 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:09.135767 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:36.741969 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:51:36.741949 2581 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:52:04.135641 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:52:04.135599 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:52:04.153916 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:52:04.153774 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:52:04.969272 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:52:04.969245 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:55:40.406326 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.406243 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-85fcr"] Apr 22 18:55:40.409568 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.409551 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:40.412079 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.412053 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 18:55:40.412240 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.412122 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:55:40.413342 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.413319 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xxtwn\"" Apr 22 18:55:40.413456 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.413325 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:55:40.416291 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.416264 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-85fcr"] Apr 22 18:55:40.431702 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.431679 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zl9w\" (UniqueName: \"kubernetes.io/projected/2262dc7f-57cd-4f21-ac7d-a72eedb9a43c-kube-api-access-8zl9w\") pod \"s3-tls-init-custom-85fcr\" (UID: \"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c\") " pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:40.532264 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.532238 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zl9w\" (UniqueName: \"kubernetes.io/projected/2262dc7f-57cd-4f21-ac7d-a72eedb9a43c-kube-api-access-8zl9w\") pod \"s3-tls-init-custom-85fcr\" (UID: \"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c\") " pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:40.541112 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.541086 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zl9w\" (UniqueName: \"kubernetes.io/projected/2262dc7f-57cd-4f21-ac7d-a72eedb9a43c-kube-api-access-8zl9w\") pod \"s3-tls-init-custom-85fcr\" (UID: \"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c\") " pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:40.730081 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.730048 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:40.850708 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.850686 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-85fcr"] Apr 22 18:55:40.852713 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:55:40.852687 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2262dc7f_57cd_4f21_ac7d_a72eedb9a43c.slice/crio-0b17b8d10b1ddd3b12cb23bdd20baef5e46b71ab1c8ec883b70d96c2e4f77c8a WatchSource:0}: Error finding container 0b17b8d10b1ddd3b12cb23bdd20baef5e46b71ab1c8ec883b70d96c2e4f77c8a: Status 404 returned error can't find the container with id 0b17b8d10b1ddd3b12cb23bdd20baef5e46b71ab1c8ec883b70d96c2e4f77c8a Apr 22 18:55:40.854490 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:40.854475 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:55:41.598355 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:41.598314 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-85fcr" event={"ID":"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c","Type":"ContainerStarted","Data":"0b17b8d10b1ddd3b12cb23bdd20baef5e46b71ab1c8ec883b70d96c2e4f77c8a"} Apr 22 18:55:45.613276 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:45.613191 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-85fcr" event={"ID":"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c","Type":"ContainerStarted","Data":"48fe11a23324475b9b67d29eb2947c268f5fae970b722d53ead00b34f0f97502"} Apr 22 18:55:45.630937 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:45.630886 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-85fcr" podStartSLOduration=1.242418778 podStartE2EDuration="5.630873772s" podCreationTimestamp="2026-04-22 18:55:40 +0000 UTC" firstStartedPulling="2026-04-22 18:55:40.854598136 +0000 UTC m=+544.577357670" lastFinishedPulling="2026-04-22 18:55:45.243053127 +0000 UTC m=+548.965812664" observedRunningTime="2026-04-22 18:55:45.62904849 +0000 UTC m=+549.351808046" watchObservedRunningTime="2026-04-22 18:55:45.630873772 +0000 UTC m=+549.353633328" Apr 22 18:55:51.634842 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:51.634808 2581 generic.go:358] "Generic (PLEG): container finished" podID="2262dc7f-57cd-4f21-ac7d-a72eedb9a43c" containerID="48fe11a23324475b9b67d29eb2947c268f5fae970b722d53ead00b34f0f97502" exitCode=0 Apr 22 18:55:51.635222 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:51.634851 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-85fcr" event={"ID":"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c","Type":"ContainerDied","Data":"48fe11a23324475b9b67d29eb2947c268f5fae970b722d53ead00b34f0f97502"} Apr 22 18:55:52.764532 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:52.764509 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:52.832451 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:52.832423 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zl9w\" (UniqueName: \"kubernetes.io/projected/2262dc7f-57cd-4f21-ac7d-a72eedb9a43c-kube-api-access-8zl9w\") pod \"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c\" (UID: \"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c\") " Apr 22 18:55:52.834582 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:52.834558 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2262dc7f-57cd-4f21-ac7d-a72eedb9a43c-kube-api-access-8zl9w" (OuterVolumeSpecName: "kube-api-access-8zl9w") pod "2262dc7f-57cd-4f21-ac7d-a72eedb9a43c" (UID: "2262dc7f-57cd-4f21-ac7d-a72eedb9a43c"). InnerVolumeSpecName "kube-api-access-8zl9w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:52.933068 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:52.933048 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zl9w\" (UniqueName: \"kubernetes.io/projected/2262dc7f-57cd-4f21-ac7d-a72eedb9a43c-kube-api-access-8zl9w\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:55:53.643498 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:53.643456 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-85fcr" event={"ID":"2262dc7f-57cd-4f21-ac7d-a72eedb9a43c","Type":"ContainerDied","Data":"0b17b8d10b1ddd3b12cb23bdd20baef5e46b71ab1c8ec883b70d96c2e4f77c8a"} Apr 22 18:55:53.643498 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:53.643491 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-85fcr" Apr 22 18:55:53.643724 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:53.643497 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b17b8d10b1ddd3b12cb23bdd20baef5e46b71ab1c8ec883b70d96c2e4f77c8a" Apr 22 18:55:55.681539 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.681507 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-jsx5s"] Apr 22 18:55:55.681925 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.681873 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2262dc7f-57cd-4f21-ac7d-a72eedb9a43c" containerName="s3-tls-init-custom" Apr 22 18:55:55.681925 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.681886 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2262dc7f-57cd-4f21-ac7d-a72eedb9a43c" containerName="s3-tls-init-custom" Apr 22 18:55:55.681994 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.681948 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2262dc7f-57cd-4f21-ac7d-a72eedb9a43c" containerName="s3-tls-init-custom" Apr 22 18:55:55.684700 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.684683 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:55:55.687167 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.687145 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:55:55.687291 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.687205 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 22 18:55:55.687699 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.687667 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:55:55.688621 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.688600 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xxtwn\"" Apr 22 18:55:55.696215 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.696184 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-jsx5s"] Apr 22 18:55:55.755300 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.755274 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49gd9\" (UniqueName: \"kubernetes.io/projected/9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97-kube-api-access-49gd9\") pod \"s3-tls-init-serving-jsx5s\" (UID: \"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97\") " pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:55:55.855681 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.855653 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49gd9\" (UniqueName: \"kubernetes.io/projected/9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97-kube-api-access-49gd9\") pod \"s3-tls-init-serving-jsx5s\" (UID: \"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97\") " pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:55:55.864612 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:55.864585 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49gd9\" (UniqueName: \"kubernetes.io/projected/9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97-kube-api-access-49gd9\") pod \"s3-tls-init-serving-jsx5s\" (UID: \"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97\") " pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:55:56.008842 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:56.008749 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:55:56.124688 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:56.124666 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-jsx5s"] Apr 22 18:55:56.126619 ip-10-0-129-249 kubenswrapper[2581]: W0422 18:55:56.126591 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9803f3b0_fae8_4e5a_ac2e_9cbaa3233f97.slice/crio-63de7d2f20ef64b1650ad54f222c0d3d82158521f7a5a81f78e96c6296ce3df3 WatchSource:0}: Error finding container 63de7d2f20ef64b1650ad54f222c0d3d82158521f7a5a81f78e96c6296ce3df3: Status 404 returned error can't find the container with id 63de7d2f20ef64b1650ad54f222c0d3d82158521f7a5a81f78e96c6296ce3df3 Apr 22 18:55:56.653264 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:56.653225 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jsx5s" event={"ID":"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97","Type":"ContainerStarted","Data":"06e6323da4c5ea301a42be32a0f6feaa6d3f685c457f67a79dc6c937bede080a"} Apr 22 18:55:56.653264 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:56.653259 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jsx5s" event={"ID":"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97","Type":"ContainerStarted","Data":"63de7d2f20ef64b1650ad54f222c0d3d82158521f7a5a81f78e96c6296ce3df3"} Apr 22 18:55:56.668985 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:55:56.668940 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-jsx5s" podStartSLOduration=1.668927102 podStartE2EDuration="1.668927102s" podCreationTimestamp="2026-04-22 18:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:55:56.667736925 +0000 UTC m=+560.390496482" watchObservedRunningTime="2026-04-22 18:55:56.668927102 +0000 UTC m=+560.391686657" Apr 22 18:56:00.668083 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:00.668050 2581 generic.go:358] "Generic (PLEG): container finished" podID="9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97" containerID="06e6323da4c5ea301a42be32a0f6feaa6d3f685c457f67a79dc6c937bede080a" exitCode=0 Apr 22 18:56:00.668395 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:00.668128 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jsx5s" event={"ID":"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97","Type":"ContainerDied","Data":"06e6323da4c5ea301a42be32a0f6feaa6d3f685c457f67a79dc6c937bede080a"} Apr 22 18:56:01.802837 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:01.802810 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:56:01.898525 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:01.898495 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49gd9\" (UniqueName: \"kubernetes.io/projected/9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97-kube-api-access-49gd9\") pod \"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97\" (UID: \"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97\") " Apr 22 18:56:01.900589 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:01.900566 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97-kube-api-access-49gd9" (OuterVolumeSpecName: "kube-api-access-49gd9") pod "9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97" (UID: "9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97"). InnerVolumeSpecName "kube-api-access-49gd9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:01.998976 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:01.998920 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49gd9\" (UniqueName: \"kubernetes.io/projected/9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97-kube-api-access-49gd9\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 18:56:02.675526 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:02.675496 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-jsx5s" Apr 22 18:56:02.675526 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:02.675511 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-jsx5s" event={"ID":"9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97","Type":"ContainerDied","Data":"63de7d2f20ef64b1650ad54f222c0d3d82158521f7a5a81f78e96c6296ce3df3"} Apr 22 18:56:02.675717 ip-10-0-129-249 kubenswrapper[2581]: I0422 18:56:02.675544 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63de7d2f20ef64b1650ad54f222c0d3d82158521f7a5a81f78e96c6296ce3df3" Apr 22 19:51:37.642095 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.642058 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkjwr/must-gather-7q4km"] Apr 22 19:51:37.644607 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.642528 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97" containerName="s3-tls-init-serving" Apr 22 19:51:37.644607 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.642544 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97" containerName="s3-tls-init-serving" Apr 22 19:51:37.644607 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.642611 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97" containerName="s3-tls-init-serving" Apr 22 19:51:37.645452 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.645435 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.648160 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.648134 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkjwr\"/\"openshift-service-ca.crt\"" Apr 22 19:51:37.648267 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.648251 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkjwr\"/\"kube-root-ca.crt\"" Apr 22 19:51:37.653119 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.653098 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkjwr/must-gather-7q4km"] Apr 22 19:51:37.809198 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.809155 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzcx\" (UniqueName: \"kubernetes.io/projected/e579da42-7383-47a0-b8f0-2f4cb50c4015-kube-api-access-dzzcx\") pod \"must-gather-7q4km\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.809314 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.809203 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e579da42-7383-47a0-b8f0-2f4cb50c4015-must-gather-output\") pod \"must-gather-7q4km\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.909545 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.909482 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzcx\" (UniqueName: \"kubernetes.io/projected/e579da42-7383-47a0-b8f0-2f4cb50c4015-kube-api-access-dzzcx\") pod \"must-gather-7q4km\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.909545 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.909518 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e579da42-7383-47a0-b8f0-2f4cb50c4015-must-gather-output\") pod \"must-gather-7q4km\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.909935 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.909917 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e579da42-7383-47a0-b8f0-2f4cb50c4015-must-gather-output\") pod \"must-gather-7q4km\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.927386 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.927362 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzcx\" (UniqueName: \"kubernetes.io/projected/e579da42-7383-47a0-b8f0-2f4cb50c4015-kube-api-access-dzzcx\") pod \"must-gather-7q4km\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:37.969413 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:37.969391 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:51:38.086442 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:38.086410 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkjwr/must-gather-7q4km"] Apr 22 19:51:38.088802 ip-10-0-129-249 kubenswrapper[2581]: W0422 19:51:38.088757 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode579da42_7383_47a0_b8f0_2f4cb50c4015.slice/crio-d92aec98bd899ff47662783555de7cbffce89389dd3f68f2aaecaddddd1b9a23 WatchSource:0}: Error finding container d92aec98bd899ff47662783555de7cbffce89389dd3f68f2aaecaddddd1b9a23: Status 404 returned error can't find the container with id d92aec98bd899ff47662783555de7cbffce89389dd3f68f2aaecaddddd1b9a23 Apr 22 19:51:38.090384 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:38.090369 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:51:38.724631 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:38.724596 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkjwr/must-gather-7q4km" event={"ID":"e579da42-7383-47a0-b8f0-2f4cb50c4015","Type":"ContainerStarted","Data":"d92aec98bd899ff47662783555de7cbffce89389dd3f68f2aaecaddddd1b9a23"} Apr 22 19:51:43.742237 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:43.742202 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkjwr/must-gather-7q4km" event={"ID":"e579da42-7383-47a0-b8f0-2f4cb50c4015","Type":"ContainerStarted","Data":"a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43"} Apr 22 19:51:43.742237 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:43.742243 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkjwr/must-gather-7q4km" event={"ID":"e579da42-7383-47a0-b8f0-2f4cb50c4015","Type":"ContainerStarted","Data":"4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c"} Apr 22 19:51:43.762010 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:51:43.761956 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkjwr/must-gather-7q4km" podStartSLOduration=2.118720753 podStartE2EDuration="6.761942658s" podCreationTimestamp="2026-04-22 19:51:37 +0000 UTC" firstStartedPulling="2026-04-22 19:51:38.090504335 +0000 UTC m=+3901.813263869" lastFinishedPulling="2026-04-22 19:51:42.733726225 +0000 UTC m=+3906.456485774" observedRunningTime="2026-04-22 19:51:43.760241796 +0000 UTC m=+3907.483001352" watchObservedRunningTime="2026-04-22 19:51:43.761942658 +0000 UTC m=+3907.484702213" Apr 22 19:52:04.811965 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:04.811887 2581 generic.go:358] "Generic (PLEG): container finished" podID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerID="4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c" exitCode=0 Apr 22 19:52:04.811965 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:04.811939 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkjwr/must-gather-7q4km" event={"ID":"e579da42-7383-47a0-b8f0-2f4cb50c4015","Type":"ContainerDied","Data":"4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c"} Apr 22 19:52:04.812374 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:04.812245 2581 scope.go:117] "RemoveContainer" containerID="4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c" Apr 22 19:52:05.503505 ip-10-0-129-249 kubenswrapper[2581]: 
I0422 19:52:05.503470 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xkjwr_must-gather-7q4km_e579da42-7383-47a0-b8f0-2f4cb50c4015/gather/0.log" Apr 22 19:52:09.276468 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:09.276435 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wmlxm_4ae2c297-7264-408e-ba35-12894de1c143/global-pull-secret-syncer/0.log" Apr 22 19:52:09.338433 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:09.338407 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dngnr_b456252b-c126-48fd-ba56-9b92b64d07ce/konnectivity-agent/0.log" Apr 22 19:52:09.441510 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:09.441489 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-249.ec2.internal_0aa354c0cbfbaf0036d5f596d6e0335c/haproxy/0.log" Apr 22 19:52:11.082829 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.082778 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xkjwr/must-gather-7q4km"] Apr 22 19:52:11.083205 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.082995 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-xkjwr/must-gather-7q4km" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="copy" containerID="cri-o://a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43" gracePeriod=2 Apr 22 19:52:11.089747 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.089723 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xkjwr/must-gather-7q4km"] Apr 22 19:52:11.304831 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.304807 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xkjwr_must-gather-7q4km_e579da42-7383-47a0-b8f0-2f4cb50c4015/copy/0.log" Apr 22 19:52:11.305153 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.305139 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:52:11.308781 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.308758 2581 status_manager.go:895] "Failed to get status for pod" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" pod="openshift-must-gather-xkjwr/must-gather-7q4km" err="pods \"must-gather-7q4km\" is forbidden: User \"system:node:ip-10-0-129-249.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xkjwr\": no relationship found between node 'ip-10-0-129-249.ec2.internal' and this object" Apr 22 19:52:11.400213 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.400148 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e579da42-7383-47a0-b8f0-2f4cb50c4015-must-gather-output\") pod \"e579da42-7383-47a0-b8f0-2f4cb50c4015\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " Apr 22 19:52:11.400213 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.400196 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzcx\" (UniqueName: \"kubernetes.io/projected/e579da42-7383-47a0-b8f0-2f4cb50c4015-kube-api-access-dzzcx\") pod \"e579da42-7383-47a0-b8f0-2f4cb50c4015\" (UID: \"e579da42-7383-47a0-b8f0-2f4cb50c4015\") " Apr 22 19:52:11.401678 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.401645 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e579da42-7383-47a0-b8f0-2f4cb50c4015-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e579da42-7383-47a0-b8f0-2f4cb50c4015" (UID: "e579da42-7383-47a0-b8f0-2f4cb50c4015"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:52:11.402438 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.402415 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e579da42-7383-47a0-b8f0-2f4cb50c4015-kube-api-access-dzzcx" (OuterVolumeSpecName: "kube-api-access-dzzcx") pod "e579da42-7383-47a0-b8f0-2f4cb50c4015" (UID: "e579da42-7383-47a0-b8f0-2f4cb50c4015"). InnerVolumeSpecName "kube-api-access-dzzcx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:52:11.500803 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.500766 2581 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e579da42-7383-47a0-b8f0-2f4cb50c4015-must-gather-output\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 19:52:11.500896 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.500813 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzzcx\" (UniqueName: \"kubernetes.io/projected/e579da42-7383-47a0-b8f0-2f4cb50c4015-kube-api-access-dzzcx\") on node \"ip-10-0-129-249.ec2.internal\" DevicePath \"\"" Apr 22 19:52:11.833066 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.833043 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xkjwr_must-gather-7q4km_e579da42-7383-47a0-b8f0-2f4cb50c4015/copy/0.log" Apr 22 19:52:11.833306 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.833286 2581 generic.go:358] "Generic (PLEG): container finished" podID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerID="a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43" exitCode=143 Apr 22 19:52:11.833373 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.833353 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkjwr/must-gather-7q4km" Apr 22 19:52:11.833430 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.833383 2581 scope.go:117] "RemoveContainer" containerID="a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43" Apr 22 19:52:11.835961 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.835932 2581 status_manager.go:895] "Failed to get status for pod" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" pod="openshift-must-gather-xkjwr/must-gather-7q4km" err="pods \"must-gather-7q4km\" is forbidden: User \"system:node:ip-10-0-129-249.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xkjwr\": no relationship found between node 'ip-10-0-129-249.ec2.internal' and this object" Apr 22 19:52:11.841273 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.841254 2581 scope.go:117] "RemoveContainer" containerID="4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c" Apr 22 19:52:11.843164 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.843141 2581 status_manager.go:895] "Failed to get status for pod" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" pod="openshift-must-gather-xkjwr/must-gather-7q4km" err="pods \"must-gather-7q4km\" is forbidden: User \"system:node:ip-10-0-129-249.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xkjwr\": no relationship found between node 'ip-10-0-129-249.ec2.internal' and this object" Apr 22 19:52:11.851964 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.851946 2581 scope.go:117] "RemoveContainer" containerID="a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43" Apr 22 19:52:11.852214 ip-10-0-129-249 kubenswrapper[2581]: E0422 19:52:11.852195 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43\": container with ID starting with a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43 not found: ID does not exist" containerID="a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43" Apr 22 
19:52:11.852273 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.852221 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43"} err="failed to get container status \"a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43\": rpc error: code = NotFound desc = could not find container \"a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43\": container with ID starting with a48d1ad77d1a6bf94ba0aa7bf4b20e3be4990c3d1099bffc2fa44ff6270ffe43 not found: ID does not exist" Apr 22 19:52:11.852273 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.852247 2581 scope.go:117] "RemoveContainer" containerID="4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c" Apr 22 19:52:11.852472 ip-10-0-129-249 kubenswrapper[2581]: E0422 19:52:11.852451 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c\": container with ID starting with 4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c not found: ID does not exist" containerID="4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c" Apr 22 19:52:11.852512 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:11.852479 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c"} err="failed to get container status \"4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c\": rpc error: code = NotFound desc = could not find container \"4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c\": container with ID starting with 4863b31d3a7c83962cc62abea252f13dd7f7cdd6b4b51c7b5b5603454b8dcf1c not found: ID does not exist" Apr 22 19:52:12.786692 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.786663 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-k4rql_c1c77ec1-bb46-4c4a-8fa4-efe3f3206330/kube-state-metrics/0.log" Apr 22 19:52:12.811412 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.811386 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-k4rql_c1c77ec1-bb46-4c4a-8fa4-efe3f3206330/kube-rbac-proxy-main/0.log" Apr 22 19:52:12.838593 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.838570 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-k4rql_c1c77ec1-bb46-4c4a-8fa4-efe3f3206330/kube-rbac-proxy-self/0.log" Apr 22 19:52:12.862142 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.862121 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" path="/var/lib/kubelet/pods/e579da42-7383-47a0-b8f0-2f4cb50c4015/volumes" Apr 22 19:52:12.940130 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.940113 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4htns_c87142b9-f4d6-412c-8056-4ca82a20f8a2/node-exporter/0.log" Apr 22 19:52:12.962474 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.962414 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4htns_c87142b9-f4d6-412c-8056-4ca82a20f8a2/kube-rbac-proxy/0.log" Apr 22 19:52:12.986464 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:12.986444 2581 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4htns_c87142b9-f4d6-412c-8056-4ca82a20f8a2/init-textfile/0.log" Apr 22 19:52:13.190395 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.190327 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kqx8l_c84610a8-5b8b-417a-9fd8-90e002f5b413/kube-rbac-proxy-main/0.log" Apr 22 19:52:13.219436 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.219360 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kqx8l_c84610a8-5b8b-417a-9fd8-90e002f5b413/kube-rbac-proxy-self/0.log" Apr 22 19:52:13.246408 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.246387 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kqx8l_c84610a8-5b8b-417a-9fd8-90e002f5b413/openshift-state-metrics/0.log" Apr 22 19:52:13.306314 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.306287 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/prometheus/0.log" Apr 22 19:52:13.327667 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.327640 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/config-reloader/0.log" Apr 22 19:52:13.354348 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.354326 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/thanos-sidecar/0.log" Apr 22 19:52:13.382136 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.382115 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/kube-rbac-proxy-web/0.log" Apr 22 19:52:13.408761 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.408737 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/kube-rbac-proxy/0.log" Apr 22 19:52:13.433694 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.433678 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/kube-rbac-proxy-thanos/0.log" Apr 22 19:52:13.460223 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.460204 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75c5e756-e7cb-4409-915b-9608df65d5d7/init-config-reloader/0.log" Apr 22 19:52:13.596186 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.596145 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-679d89dd4b-549qs_e41550af-02c2-48fe-ac0a-774a42ba00e0/telemeter-client/0.log" Apr 22 19:52:13.621164 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.621142 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-679d89dd4b-549qs_e41550af-02c2-48fe-ac0a-774a42ba00e0/reload/0.log" Apr 22 19:52:13.647668 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.647641 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-679d89dd4b-549qs_e41550af-02c2-48fe-ac0a-774a42ba00e0/kube-rbac-proxy/0.log" Apr 22 19:52:13.685877 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.685848 
2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9b48d7d7-5pr8b_296c7cc3-d484-4a0d-86fa-1239c66bd7b1/thanos-query/0.log" Apr 22 19:52:13.715977 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.715957 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9b48d7d7-5pr8b_296c7cc3-d484-4a0d-86fa-1239c66bd7b1/kube-rbac-proxy-web/0.log" Apr 22 19:52:13.744732 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.744705 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9b48d7d7-5pr8b_296c7cc3-d484-4a0d-86fa-1239c66bd7b1/kube-rbac-proxy/0.log" Apr 22 19:52:13.769644 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.769623 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9b48d7d7-5pr8b_296c7cc3-d484-4a0d-86fa-1239c66bd7b1/prom-label-proxy/0.log" Apr 22 19:52:13.794611 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.794590 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9b48d7d7-5pr8b_296c7cc3-d484-4a0d-86fa-1239c66bd7b1/kube-rbac-proxy-rules/0.log" Apr 22 19:52:13.821230 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:13.821173 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-9b48d7d7-5pr8b_296c7cc3-d484-4a0d-86fa-1239c66bd7b1/kube-rbac-proxy-metrics/0.log" Apr 22 19:52:16.274922 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.274889 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg"] Apr 22 19:52:16.275297 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.275280 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="copy" Apr 22 19:52:16.275388 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.275299 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="copy" Apr 22 19:52:16.275388 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.275313 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="gather" Apr 22 19:52:16.275388 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.275319 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="gather" Apr 22 19:52:16.275388 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.275382 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="gather" Apr 22 19:52:16.275574 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.275397 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e579da42-7383-47a0-b8f0-2f4cb50c4015" containerName="copy" Apr 22 19:52:16.279927 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.279906 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.282602 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.282582 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8tvdg\"/\"kube-root-ca.crt\"" Apr 22 19:52:16.282702 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.282582 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8tvdg\"/\"default-dockercfg-ksxp9\"" Apr 22 19:52:16.283933 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.283915 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8tvdg\"/\"openshift-service-ca.crt\"" Apr 22 19:52:16.288961 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.288936 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg"] Apr 22 19:52:16.336825 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.336775 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-podres\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.336825 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.336828 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-sys\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.337033 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.336972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-lib-modules\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.337033 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.337007 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-proc\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.337124 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.337043 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bv6\" (UniqueName: \"kubernetes.io/projected/b04eac24-0822-47fb-b1ce-20c7e4673f5f-kube-api-access-x7bv6\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.347696 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.347664 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-2n2sx_5e8b4e54-c935-40c6-be8a-d2c22c575aa3/volume-data-source-validator/0.log" Apr 22 19:52:16.437691 ip-10-0-129-249 kubenswrapper[2581]: I0422 
19:52:16.437666 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-lib-modules\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437814 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437694 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-proc\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437814 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437715 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bv6\" (UniqueName: \"kubernetes.io/projected/b04eac24-0822-47fb-b1ce-20c7e4673f5f-kube-api-access-x7bv6\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437814 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437737 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-podres\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437970 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437826 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-proc\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437970 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437858 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-podres\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437970 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437860 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-lib-modules\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.437970 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437880 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-sys\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.438112 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.437974 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04eac24-0822-47fb-b1ce-20c7e4673f5f-sys\") pod 
\"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.458057 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.458029 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bv6\" (UniqueName: \"kubernetes.io/projected/b04eac24-0822-47fb-b1ce-20c7e4673f5f-kube-api-access-x7bv6\") pod \"perf-node-gather-daemonset-rmdkg\" (UID: \"b04eac24-0822-47fb-b1ce-20c7e4673f5f\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.590834 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.590761 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.707500 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.707387 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg"] Apr 22 19:52:16.710261 ip-10-0-129-249 kubenswrapper[2581]: W0422 19:52:16.710226 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb04eac24_0822_47fb_b1ce_20c7e4673f5f.slice/crio-6f4fec6c8c159d203bd0d53c1df6f35387daade03099609a356f99731fa8f326 WatchSource:0}: Error finding container 6f4fec6c8c159d203bd0d53c1df6f35387daade03099609a356f99731fa8f326: Status 404 returned error can't find the container with id 6f4fec6c8c159d203bd0d53c1df6f35387daade03099609a356f99731fa8f326 Apr 22 19:52:16.848981 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.848909 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" event={"ID":"b04eac24-0822-47fb-b1ce-20c7e4673f5f","Type":"ContainerStarted","Data":"7b41394b39eeca12234dfa358d4c2a22363306492ecc9d548621d9a4185d7e86"} Apr 22 19:52:16.848981 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.848949 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" event={"ID":"b04eac24-0822-47fb-b1ce-20c7e4673f5f","Type":"ContainerStarted","Data":"6f4fec6c8c159d203bd0d53c1df6f35387daade03099609a356f99731fa8f326"} Apr 22 19:52:16.849166 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.849076 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:16.865469 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:16.865421 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" podStartSLOduration=0.865407047 podStartE2EDuration="865.407047ms" podCreationTimestamp="2026-04-22 19:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:52:16.86451455 +0000 UTC m=+3940.587274105" watchObservedRunningTime="2026-04-22 19:52:16.865407047 +0000 UTC m=+3940.588166604" Apr 22 19:52:17.180271 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:17.180234 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6vxjs_e4ec9d5d-c253-4d72-ba5a-a2af35c106d2/dns/0.log" Apr 22 19:52:17.209425 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:17.209399 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-6vxjs_e4ec9d5d-c253-4d72-ba5a-a2af35c106d2/kube-rbac-proxy/0.log" Apr 22 19:52:17.345713 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:17.345683 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sp2sj_3453260f-3618-4209-b141-058bfe076e0c/dns-node-resolver/0.log" Apr 22 19:52:17.851058 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:17.851029 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-grdm6_96308ab5-cbfb-459e-9e75-9d548626286b/node-ca/0.log" Apr 22 19:52:18.697225 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:18.697190 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-578f969574-tk4dr_b44c3680-e1d1-4e14-b58a-8dccd8912f42/router/0.log" Apr 22 19:52:19.114501 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:19.114425 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qbsd9_a5c3671b-f180-45d3-aad6-34b06441fbac/serve-healthcheck-canary/0.log" Apr 22 19:52:19.501973 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:19.501947 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qkfrb_705539a2-839e-49d6-b593-4edbd2dce2aa/insights-operator/0.log" Apr 22 19:52:19.503063 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:19.503042 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qkfrb_705539a2-839e-49d6-b593-4edbd2dce2aa/insights-operator/1.log" Apr 22 19:52:19.619215 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:19.619184 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hd5dg_91926044-1ae6-4a59-b70b-8694263f69bc/kube-rbac-proxy/0.log" Apr 22 19:52:19.645126 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:19.645104 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hd5dg_91926044-1ae6-4a59-b70b-8694263f69bc/exporter/0.log" Apr 22 19:52:19.693877 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:19.693855 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hd5dg_91926044-1ae6-4a59-b70b-8694263f69bc/extractor/0.log" Apr 22 19:52:22.242367 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:22.242340 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-85fcr_2262dc7f-57cd-4f21-ac7d-a72eedb9a43c/s3-tls-init-custom/0.log" Apr 22 19:52:22.269330 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:22.269305 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-jsx5s_9803f3b0-fae8-4e5a-ac2e-9cbaa3233f97/s3-tls-init-serving/0.log" Apr 22 19:52:22.862483 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:22.862461 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-rmdkg" Apr 22 19:52:26.679468 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:26.679438 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qsss7_0dc69694-41de-4c17-9221-d8d5fed0aed2/migrator/0.log" Apr 22 19:52:26.703162 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:26.703134 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qsss7_0dc69694-41de-4c17-9221-d8d5fed0aed2/graceful-termination/0.log" Apr 22 19:52:28.237614 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.237579 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/kube-multus-additional-cni-plugins/0.log" Apr 22 19:52:28.262434 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.262406 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/egress-router-binary-copy/0.log" Apr 22 19:52:28.290748 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.290728 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/cni-plugins/0.log" Apr 22 19:52:28.315476 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.315453 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/bond-cni-plugin/0.log" Apr 22 19:52:28.339045 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.339017 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/routeoverride-cni/0.log" Apr 22 19:52:28.367107 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.367088 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/whereabouts-cni-bincopy/0.log" Apr 22 19:52:28.393588 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.393568 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dhp4b_63aaeb86-a51a-4444-93df-19041d851cd6/whereabouts-cni/0.log" Apr 22 19:52:28.618328 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.618252 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dvr84_add2dc4a-bd5c-417c-91dd-132eb3de7087/kube-multus/0.log" Apr 22 19:52:28.804727 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.804698 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z9kwg_47eec246-c244-4918-8600-48de7568588b/network-metrics-daemon/0.log" Apr 22 19:52:28.827017 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:28.826997 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z9kwg_47eec246-c244-4918-8600-48de7568588b/kube-rbac-proxy/0.log" Apr 22 19:52:29.622477 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.622426 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/ovn-controller/0.log" Apr 22 19:52:29.679982 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.679952 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/ovn-acl-logging/0.log" Apr 22 19:52:29.708283 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.708256 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/kube-rbac-proxy-node/0.log" Apr 22 19:52:29.736040 ip-10-0-129-249 
kubenswrapper[2581]: I0422 19:52:29.736011 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:52:29.756674 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.756648 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/northd/0.log" Apr 22 19:52:29.782913 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.782892 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/nbdb/0.log" Apr 22 19:52:29.807931 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.807892 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/sbdb/0.log" Apr 22 19:52:29.989698 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:29.989660 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b9cf6_b77cfc29-6e5e-47f5-b607-aa33e5a172af/ovnkube-controller/0.log" Apr 22 19:52:31.686085 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:31.686057 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b4wkh_2ee4ece0-4e59-4b13-a6f7-140e212f2fd7/network-check-target-container/0.log" Apr 22 19:52:32.679643 ip-10-0-129-249 kubenswrapper[2581]: I0422 19:52:32.679605 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qh2jx_697a8ed9-86fe-434b-9bc5-3296e657ff3d/iptables-alerter/0.log"