Apr 21 10:01:12.835101 ip-10-0-132-46 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 10:01:12.835116 ip-10-0-132-46 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 10:01:12.835126 ip-10-0-132-46 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 10:01:12.835449 ip-10-0-132-46 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 10:01:22.942289 ip-10-0-132-46 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 10:01:22.942306 ip-10-0-132-46 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ae261cbd828e4dd484f1b0ae7931f62b --
Apr 21 10:03:55.487682 ip-10-0-132-46 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:55.944574 ip-10-0-132-46 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:55.944574 ip-10-0-132-46 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:55.944574 ip-10-0-132-46 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:55.944574 ip-10-0-132-46 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:55.944574 ip-10-0-132-46 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
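The deprecation warnings above only say that these flags should move into the file named by --config (the FLAG dump later in this log shows --config=/etc/kubernetes/kubelet.conf); they do not themselves fail the kubelet. A minimal sketch of the equivalent KubeletConfiguration stanza, assuming the upstream kubelet.config.k8s.io/v1beta1 schema and reusing the values reported in the FLAG dump below; it is illustrative only, not the config actually rendered on this node:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # config-file equivalent of --container-runtime-endpoint (value copied from the FLAG dump below)
    containerRuntimeEndpoint: /var/run/crio/crio.sock
    # config-file equivalent of --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # config-file equivalent of --system-reserved
    systemReserved:
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi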
Apr 21 10:03:55.946547 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.946415 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 21 10:03:55.949573 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949558 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:55.949573 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949572 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949576 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949579 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949582 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949585 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949588 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949590 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949593 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949595 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949598 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949601 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949604 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949607 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949609 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949612 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949619 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949621 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949624 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949626 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:55.949636 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949631 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949634 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949638 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949641 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949644 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949647 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949650 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949652 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949655 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949658 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949660 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949662 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949665 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949667 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949670 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949672 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949675 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949677 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949680 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949682 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:55.950095 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949685 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949687 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949690 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949692 2577 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949695 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949697 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949700 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949702 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949706 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949710 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949713 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949715 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949717 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949720 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949723 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949726 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949728 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949731 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949733 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:55.950610 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949736 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949739 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949741 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949744 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949746 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949749 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949752 2577 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949754 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949757 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949759 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949762 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949764 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949767 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949769 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949772 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949774 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949776 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949779 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949782 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949784 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:55.951091 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949787 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949789 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949791 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949794 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949796 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949799 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.949801 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950178 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950183 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 
10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950186 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950189 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950192 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950195 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950198 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950200 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950204 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950208 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950211 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950214 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950217 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:55.951580 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950220 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950222 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950224 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950227 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950229 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950232 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950234 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950237 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950239 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950242 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950244 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950247 2577 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950249 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950251 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950254 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950256 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950259 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950262 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950264 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950267 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:55.952078 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950270 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950272 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950274 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950277 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950279 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950282 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950284 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950287 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950289 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950292 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950295 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950298 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950301 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950303 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:55.952614 ip-10-0-132-46 
kubenswrapper[2577]: W0421 10:03:55.950305 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950308 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950310 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950313 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950315 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950317 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:55.952614 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950321 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950323 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950325 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950329 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950331 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950334 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950336 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950339 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950342 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950346 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950349 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950353 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950356 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950360 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950363 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950366 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950368 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950371 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950374 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:55.953104 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950377 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950380 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950382 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950385 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950404 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950407 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950410 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950413 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950415 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950418 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950421 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950423 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950426 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.950428 2577 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951088 2577 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951097 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951104 2577 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951109 2577 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951113 2577 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951116 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951121 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 10:03:55.953595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951125 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951129 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951132 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951135 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951138 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951141 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951145 2577 flags.go:64] FLAG: --cgroup-root="" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951148 2577 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951151 2577 flags.go:64] FLAG: --client-ca-file="" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951153 2577 flags.go:64] FLAG: --cloud-config="" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951156 2577 flags.go:64] FLAG: --cloud-provider="external" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951159 2577 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951163 2577 flags.go:64] FLAG: --cluster-domain="" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951166 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951169 2577 flags.go:64] FLAG: --config-dir="" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951172 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951175 2577 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951179 2577 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 10:03:55.954093 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:03:55.951182 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951185 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951188 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951191 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951194 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951197 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951200 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 10:03:55.954093 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951203 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951207 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951210 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951213 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951216 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951219 2577 flags.go:64] FLAG: --enable-server="true" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951221 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951226 2577 flags.go:64] FLAG: --event-burst="100" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951229 2577 flags.go:64] FLAG: --event-qps="50" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951232 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951235 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951238 2577 flags.go:64] FLAG: --eviction-hard="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951242 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951245 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951248 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951251 2577 flags.go:64] FLAG: --eviction-soft="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951253 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951256 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951259 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" 
Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951262 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951265 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951267 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951270 2577 flags.go:64] FLAG: --feature-gates="" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951274 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951277 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 10:03:55.954757 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951280 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951283 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951286 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951289 2577 flags.go:64] FLAG: --help="false" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951292 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-132-46.ec2.internal" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951296 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951299 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951302 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951306 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951309 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951312 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951315 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951318 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951321 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951324 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951327 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951330 2577 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951332 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951335 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:55.955363 
ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951338 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951341 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951344 2577 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951347 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951350 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:55.955363 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951353 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951358 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951361 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951364 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951366 2577 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951369 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951372 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951375 2577 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951378 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951382 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951385 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951402 2577 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951405 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951409 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951411 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951414 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951417 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951420 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951423 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951430 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951433 2577 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951436 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951438 2577 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:55.955958 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951441 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951447 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951450 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951453 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951456 2577 flags.go:64] FLAG: --port="10250" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951459 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951461 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01845dc729937549d" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951464 2577 flags.go:64] FLAG: --qos-reserved="" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951467 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951470 2577 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951473 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951476 2577 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951480 2577 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951482 2577 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951485 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951487 2577 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951491 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951494 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951497 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951500 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951503 2577 flags.go:64] FLAG: --runonce="false" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951506 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951509 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951512 2577 flags.go:64] FLAG: 
--seccomp-default="false" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951515 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951518 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:55.956521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951521 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951523 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951526 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951529 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951532 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951534 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951537 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951540 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951543 2577 flags.go:64] FLAG: --system-cgroups="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951546 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951551 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951554 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951557 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951560 2577 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951563 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951566 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951569 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951571 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951574 2577 flags.go:64] FLAG: --v="2" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951578 2577 flags.go:64] FLAG: --version="false" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951582 2577 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951586 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.951589 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951679 2577 feature_gate.go:328] unrecognized 
feature gate: OVNObservability Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951683 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:55.957147 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951687 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951690 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951693 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951695 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951698 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951700 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951703 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951705 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951708 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951710 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951713 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951715 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951718 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951720 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951724 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951727 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951730 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951733 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951735 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:55.957781 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951738 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951741 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951744 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951747 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951749 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951752 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951755 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951757 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951759 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951762 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951764 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951767 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951769 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951772 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951774 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951777 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951780 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951782 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951785 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 
10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951787 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951789 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:55.958254 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951792 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951794 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951796 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951799 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951801 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951804 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951806 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951808 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951811 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951813 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951815 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951818 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951823 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951825 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951828 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951830 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951833 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951835 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951839 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
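The repeated "unrecognized feature gate" warnings above come from gate names the embedded feature-gate parser does not register (they appear to be OpenShift cluster-level gates rather than kubelet gates); they are emitted each time the gate set is parsed at startup and the kubelet continues regardless. Below is a minimal, self-contained Go sketch for tallying them from a saved journal excerpt; only the message format is taken from the log, while the file name and output layout are illustrative choices.

// summarize_gate_warnings.go: count "unrecognized feature gate" warnings
// in a journal excerpt read from stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	// Matches the literal message format seen in the log above.
	re := regexp.MustCompile(`feature_gate\.go:\d+\] unrecognized feature gate: (\S+)`)
	counts := map[string]int{}

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read error:", err)
		os.Exit(1)
	}

	gates := make([]string, 0, len(counts))
	for g := range counts {
		gates = append(gates, g)
	}
	sort.Strings(gates)
	for _, g := range gates {
		fmt.Printf("%4d  %s\n", counts[g], g)
	}
}

Piping a saved excerpt through it, e.g. journalctl -u kubelet --no-pager | go run summarize_gate_warnings.go, yields one count per gate name instead of the wall of warnings above.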
Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951842 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:55.959051 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951845 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951847 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951850 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951853 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951855 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951858 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951861 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951863 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951866 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951868 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951871 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951873 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951875 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951878 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951880 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951883 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951885 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951888 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951890 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951893 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:55.959825 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951895 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951897 2577 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951900 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.951903 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.952510 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.959868 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.959883 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959930 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959934 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959937 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959940 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959943 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959947 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959950 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959953 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:55.960309 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959955 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959958 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959961 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959963 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959966 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959968 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: 
W0421 10:03:55.959972 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959976 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959978 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959981 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959983 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959986 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959989 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959991 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959994 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959996 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.959999 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960002 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960004 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:55.960711 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960007 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960016 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960018 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960021 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960024 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960027 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960030 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960032 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960035 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960037 2577 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960040 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960042 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960045 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960047 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960050 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960053 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960055 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960058 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960060 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960064 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 10:03:55.961175 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960068 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960070 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960073 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960076 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960079 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960082 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960084 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960087 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960090 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960093 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960095 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960098 2577 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImages Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960101 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960103 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960106 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960109 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960113 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960115 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960118 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960120 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:55.961722 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960123 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960125 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960128 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960131 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960133 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960135 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960138 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960140 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960143 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960146 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960149 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960151 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960154 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960156 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960158 2577 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960161 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960164 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960166 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:55.962239 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960168 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.960174 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960267 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960272 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960275 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960278 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960281 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960284 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960286 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960290 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
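Alongside the unrecognized-gate warnings, the log shows two other behaviors: setting a deprecated gate (KMSv1) or a GA gate (ServiceAccountTokenNodeBinding) still takes effect but emits a warning, and the effective gate map is then dumped once (feature_gate.go:384). The sketch below reproduces that pattern in a self-contained way purely for illustration; it is not the kubelet's actual feature_gate.go implementation, and the gate names in main are merely examples borrowed from the log.

// Illustrative sketch of the gate-handling pattern implied by the warnings
// above: unknown names only warn, deprecated/GA gates warn but still apply,
// and the effective map is printed at the end.
package main

import "log"

type stability int

const (
	alpha stability = iota
	beta
	ga
	deprecated
)

type gate struct {
	enabled bool
	level   stability
}

func applyGates(known map[string]gate, requested map[string]bool) map[string]gate {
	for name, val := range requested {
		g, ok := known[name]
		if !ok {
			// Corresponds to the "unrecognized feature gate: <name>" warnings.
			log.Printf("W unrecognized feature gate: %s", name)
			continue
		}
		switch g.level {
		case deprecated:
			log.Printf("W Setting deprecated feature gate %s=%v. It will be removed in a future release.", name, val)
		case ga:
			log.Printf("W Setting GA feature gate %s=%v. It will be removed in a future release.", name, val)
		}
		g.enabled = val
		known[name] = g
	}
	return known
}

func main() {
	known := map[string]gate{
		"KMSv1":                          {level: deprecated},
		"ServiceAccountTokenNodeBinding": {level: ga},
		"NodeSwap":                       {level: beta},
	}
	effective := applyGates(known, map[string]bool{
		"KMSv1":                          true,
		"ServiceAccountTokenNodeBinding": true,
		"RouteAdvertisements":            true, // not registered here, so it only warns
	})
	log.Printf("I feature gates: %v", effective)
}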
Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960295 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960298 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960301 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960303 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960306 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:55.962685 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960308 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960311 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960314 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960316 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960320 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960323 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960325 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960327 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960330 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960333 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960335 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960338 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960340 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960343 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960345 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960347 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960350 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960352 2577 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960354 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960357 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:55.963128 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960359 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960361 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960364 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960366 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960368 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960371 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960373 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960376 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960378 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960381 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960384 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960386 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960404 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960407 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960409 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960412 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960414 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960416 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960419 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960421 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:55.963679 ip-10-0-132-46 kubenswrapper[2577]: W0421 
10:03:55.960424 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960426 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960429 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960432 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960434 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960436 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960439 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960441 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960443 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960446 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960448 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960451 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960453 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960457 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960460 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960462 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960465 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960467 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960470 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:55.964159 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960472 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960475 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960477 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960480 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960482 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960484 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960487 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960489 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960492 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960494 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960497 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960499 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960501 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:55.960504 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.960509 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 10:03:55.964680 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.960619 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 10:03:55.965044 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.964264 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 10:03:55.965268 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.965257 2577 server.go:1019] "Starting client certificate rotation" Apr 21 10:03:55.965381 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.965363 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 10:03:55.965460 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.965436 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 10:03:55.993723 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.993696 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 10:03:55.995404 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:55.995374 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 10:03:56.013538 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.013517 2577 log.go:25] "Validated CRI v1 runtime API" Apr 21 10:03:56.020480 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.020462 2577 log.go:25] "Validated CRI v1 image API" Apr 21 10:03:56.021817 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.021790 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 10:03:56.024685 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.024652 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 10:03:56.027353 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.027334 2577 fs.go:135] Filesystem UUIDs: map[3d6dc116-ffb8-4a3a-8e3c-c23e08434864:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b2c1634a-853b-489c-80fd-5c1e2b9a92f0:/dev/nvme0n1p4] Apr 21 10:03:56.027429 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.027352 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 10:03:56.033451 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.033333 2577 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:56.031362686 +0000 UTC m=+0.425951952 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3089451 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a22da579deeb715c0d7d48c03907b SystemUUID:ec2a22da-579d-eeb7-15c0-d7d48c03907b BootID:ae261cbd-828e-4dd4-84f1-b0ae7931f62b Filesystems:[{Device:/tmp 
DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:15:18:2a:ab:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:15:18:2a:ab:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:78:3f:b7:af:a1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 10:03:56.033451 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.033447 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 10:03:56.033555 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.033523 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 10:03:56.034952 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.034933 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 10:03:56.035091 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.034955 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-46.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 10:03:56.035131 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.035100 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 10:03:56.035131 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.035109 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 10:03:56.035131 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.035121 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:56.035941 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.035931 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:56.037352 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.037342 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:56.037526 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.037515 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 10:03:56.040201 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.040192 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 21 10:03:56.040247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.040212 2577 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 21 10:03:56.040247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.040225 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 10:03:56.040247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.040237 2577 kubelet.go:397] "Adding apiserver pod source" Apr 21 10:03:56.040247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.040247 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 10:03:56.041114 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.041098 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vjkcs" Apr 21 10:03:56.041569 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.041559 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:56.041606 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.041577 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:56.044984 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.044962 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 10:03:56.046273 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.046258 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 10:03:56.048115 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048095 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 10:03:56.048115 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048113 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 10:03:56.048115 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048119 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048125 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048131 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048137 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048142 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048147 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048154 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048160 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048168 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 10:03:56.048247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.048176 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 10:03:56.049199 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.049185 
2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vjkcs" Apr 21 10:03:56.049252 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.049242 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 10:03:56.049285 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.049265 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 10:03:56.050955 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.050922 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 10:03:56.051020 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.050960 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-46.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 10:03:56.052904 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.052891 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 10:03:56.052975 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.052932 2577 server.go:1295] "Started kubelet" Apr 21 10:03:56.053080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.053030 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 10:03:56.053155 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.053096 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 10:03:56.053188 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.053173 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 10:03:56.053927 ip-10-0-132-46 systemd[1]: Started Kubernetes Kubelet. 
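The bootstrap sequence above (bootstrap credentials used to request a certificate, CSR csr-vjkcs approved and then issued, client certificate rotation enabled) leaves a rotated client certificate under the kubelet root directory. The sketch below prints that certificate's validity window to confirm rotation worked; the path /var/lib/kubelet/pki/kubelet-client-current.pem is the conventional default under the logged root dir /var/lib/kubelet and is an assumption, not something this log states.

// Print the validity window of the kubelet client certificate produced by
// the TLS bootstrap above.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
)

func main() {
	path := "/var/lib/kubelet/pki/kubelet-client-current.pem" // assumed default location
	data, err := os.ReadFile(path)
	if err != nil {
		log.Fatalf("read %s: %v", path, err)
	}
	// The file may hold both the certificate and the private key; take the
	// first CERTIFICATE block.
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			log.Fatalf("parse certificate: %v", err)
		}
		fmt.Printf("subject:    %s\n", cert.Subject)
		fmt.Printf("not before: %s\n", cert.NotBefore)
		fmt.Printf("not after:  %s\n", cert.NotAfter)
		return
	}
	log.Fatalf("no CERTIFICATE block found in %s", path)
}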
Apr 21 10:03:56.055004 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.054982 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 10:03:56.056110 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.056090 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 21 10:03:56.062773 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.062754 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-46.ec2.internal" not found Apr 21 10:03:56.064687 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.064643 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:56.065305 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.065289 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 10:03:56.066135 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066112 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 10:03:56.066135 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066114 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 10:03:56.066256 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066145 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 10:03:56.066302 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066296 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 21 10:03:56.066345 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066305 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 21 10:03:56.066493 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066472 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 10:03:56.066493 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066490 2577 factory.go:55] Registering systemd factory Apr 21 10:03:56.066617 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066499 2577 factory.go:223] Registration of the systemd container factory successfully Apr 21 10:03:56.066617 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.066501 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.066878 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066864 2577 factory.go:153] Registering CRI-O factory Apr 21 10:03:56.066878 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066880 2577 factory.go:223] Registration of the crio container factory successfully Apr 21 10:03:56.066993 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066903 2577 factory.go:103] Registering Raw factory Apr 21 10:03:56.066993 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.066916 2577 manager.go:1196] Started watching for new ooms in manager Apr 21 10:03:56.067067 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.067022 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 10:03:56.067513 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.067489 2577 manager.go:319] Starting recovery of all containers Apr 21 10:03:56.067608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.067593 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:56.069984 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.069966 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-46.ec2.internal\" not found" node="ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.077924 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.077776 2577 manager.go:324] Recovery completed Apr 21 10:03:56.080801 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.080776 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-46.ec2.internal" not found Apr 21 10:03:56.082171 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.082147 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:56.085606 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.085589 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:56.085676 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.085617 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:56.085676 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.085632 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:56.086054 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.086039 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 10:03:56.086054 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.086054 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 10:03:56.086134 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.086069 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:56.088180 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.088169 2577 policy_none.go:49] "None policy: Start" Apr 21 10:03:56.088214 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.088185 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 10:03:56.088214 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.088194 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.127617 2577 manager.go:341] "Starting Device Plugin manager" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.127643 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.127654 2577 server.go:85] "Starting device plugin registration server" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.127885 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.127895 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 10:03:56.145459 
ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.127983 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.128065 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.128075 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.128615 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.128650 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.145459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.136262 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-46.ec2.internal" not found Apr 21 10:03:56.197842 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.197783 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 10:03:56.199069 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.199045 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 10:03:56.199069 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.199071 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 10:03:56.199190 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.199088 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 10:03:56.199190 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.199095 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 10:03:56.199190 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.199125 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 10:03:56.203073 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.203056 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:56.228820 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.228803 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:56.229564 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.229551 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:56.229629 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.229576 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:56.229629 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.229587 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:56.229629 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.229608 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.239527 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.239500 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.239527 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.239527 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-46.ec2.internal\": node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.258664 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.258645 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.300060 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.300039 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal"] Apr 21 10:03:56.300112 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.300096 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:56.301642 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.301629 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:56.301694 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.301655 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:56.301694 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.301664 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:56.303044 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303033 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:56.303186 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:03:56.303173 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.303233 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303205 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:56.303672 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303659 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:56.303747 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303677 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:56.303747 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303686 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:56.303747 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303731 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:56.303842 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303755 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:56.303842 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.303775 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:56.305007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.304986 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.305098 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.305015 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:56.305641 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.305624 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:56.305727 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.305659 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:56.305727 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.305672 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:56.331931 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.331918 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-46.ec2.internal\" not found" node="ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.336205 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.336188 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-46.ec2.internal\" not found" node="ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.359453 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.359435 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.368167 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.368144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b63f7e1ac759f67a8c9ef93a3b1d257-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal\" (UID: \"4b63f7e1ac759f67a8c9ef93a3b1d257\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.368257 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.368174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d7f4dde93369cc99ed3e326eef29a265-config\") pod \"kube-apiserver-proxy-ip-10-0-132-46.ec2.internal\" (UID: \"d7f4dde93369cc99ed3e326eef29a265\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.368257 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.368199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4b63f7e1ac759f67a8c9ef93a3b1d257-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal\" (UID: \"4b63f7e1ac759f67a8c9ef93a3b1d257\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.460463 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.460419 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.468776 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.468756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4b63f7e1ac759f67a8c9ef93a3b1d257-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal\" (UID: \"4b63f7e1ac759f67a8c9ef93a3b1d257\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.468852 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.468774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4b63f7e1ac759f67a8c9ef93a3b1d257-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal\" (UID: \"4b63f7e1ac759f67a8c9ef93a3b1d257\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.468852 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.468808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b63f7e1ac759f67a8c9ef93a3b1d257-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal\" (UID: \"4b63f7e1ac759f67a8c9ef93a3b1d257\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.468852 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.468841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d7f4dde93369cc99ed3e326eef29a265-config\") pod \"kube-apiserver-proxy-ip-10-0-132-46.ec2.internal\" (UID: \"d7f4dde93369cc99ed3e326eef29a265\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.468969 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.468875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b63f7e1ac759f67a8c9ef93a3b1d257-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal\" (UID: \"4b63f7e1ac759f67a8c9ef93a3b1d257\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.468969 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.468875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d7f4dde93369cc99ed3e326eef29a265-config\") pod \"kube-apiserver-proxy-ip-10-0-132-46.ec2.internal\" (UID: \"d7f4dde93369cc99ed3e326eef29a265\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.561199 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.561167 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.633635 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.633616 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.639084 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.639069 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" Apr 21 10:03:56.662232 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.662207 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.762705 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.762650 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.863207 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.863177 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.963670 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:56.963645 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:56.965809 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.965793 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 10:03:56.965935 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.965920 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 10:03:56.965980 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:56.965954 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 10:03:57.051711 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.051663 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:56 +0000 UTC" deadline="2027-12-10 22:05:12.587223003 +0000 UTC" Apr 21 10:03:57.051870 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.051713 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14364h1m15.53551489s" Apr 21 10:03:57.063736 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:57.063711 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-46.ec2.internal\" not found" Apr 21 10:03:57.064810 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.064793 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:57.084846 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.084827 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 10:03:57.103784 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.103745 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zqwdc" Apr 21 10:03:57.113537 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.113521 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zqwdc" Apr 21 10:03:57.125120 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:57.125088 2577 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7f4dde93369cc99ed3e326eef29a265.slice/crio-78cee4da9cf67b3a2667bb28b081404bb724d49339b8ebb5004efc70198f525b WatchSource:0}: Error finding container 78cee4da9cf67b3a2667bb28b081404bb724d49339b8ebb5004efc70198f525b: Status 404 returned error can't find the container with id 78cee4da9cf67b3a2667bb28b081404bb724d49339b8ebb5004efc70198f525b Apr 21 10:03:57.129235 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:57.129216 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b63f7e1ac759f67a8c9ef93a3b1d257.slice/crio-495430ca2c167f32cc2f355a2c780a5d1a8e5f423725681bf38e4f8d77867e83 WatchSource:0}: Error finding container 495430ca2c167f32cc2f355a2c780a5d1a8e5f423725681bf38e4f8d77867e83: Status 404 returned error can't find the container with id 495430ca2c167f32cc2f355a2c780a5d1a8e5f423725681bf38e4f8d77867e83 Apr 21 10:03:57.130711 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.130635 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:57.131031 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.131020 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:03:57.166196 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.166177 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" Apr 21 10:03:57.177080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.177063 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:57.179094 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.179083 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" Apr 21 10:03:57.187402 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.187377 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:57.202324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.202283 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" event={"ID":"4b63f7e1ac759f67a8c9ef93a3b1d257","Type":"ContainerStarted","Data":"495430ca2c167f32cc2f355a2c780a5d1a8e5f423725681bf38e4f8d77867e83"} Apr 21 10:03:57.203135 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.203116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" event={"ID":"d7f4dde93369cc99ed3e326eef29a265","Type":"ContainerStarted","Data":"78cee4da9cf67b3a2667bb28b081404bb724d49339b8ebb5004efc70198f525b"} Apr 21 10:03:57.362400 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.362325 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:57.868146 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:57.868119 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:58.041435 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.041405 2577 
apiserver.go:52] "Watching apiserver" Apr 21 10:03:58.052444 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.052414 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 10:03:58.055129 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.055106 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-8l6xb","openshift-ovn-kubernetes/ovnkube-node-vswh9","kube-system/konnectivity-agent-cslsr","kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8","openshift-cluster-node-tuning-operator/tuned-x9n4l","openshift-dns/node-resolver-c9648","openshift-image-registry/node-ca-fznzv","openshift-multus/multus-additional-cni-plugins-wz5k6","openshift-network-diagnostics/network-check-target-mwlr4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal","openshift-multus/multus-b24s6","openshift-multus/network-metrics-daemon-7czdf"] Apr 21 10:03:58.056521 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.056503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:03:58.056603 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.056584 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:03:58.058027 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.058000 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.060253 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.060233 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.060540 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.060492 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 10:03:58.060968 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.060943 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 10:03:58.061152 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.061139 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.062096 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.062073 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.062385 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.062363 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-988zs\"" Apr 21 10:03:58.063106 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.063083 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 10:03:58.064419 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.064380 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 10:03:58.064511 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.064442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.064511 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.064463 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gqrt7\"" Apr 21 10:03:58.064620 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.064519 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.064682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.064670 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 10:03:58.065086 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.065063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 10:03:58.066228 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.066212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.066315 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.066301 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.066822 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.066799 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 10:03:58.066923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.066907 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.067195 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.067177 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5274v\"" Apr 21 10:03:58.067286 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.067256 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.067286 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.067262 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qrxk4\"" Apr 21 10:03:58.067670 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.067298 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.067670 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.067344 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:58.067670 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.067256 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.068152 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.068130 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.068606 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.068591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.068686 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.068639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7n9xf\"" Apr 21 10:03:58.068871 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.068857 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 10:03:58.069268 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.069248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.069352 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.069339 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.069538 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.069522 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-27kw6\"" Apr 21 10:03:58.069618 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.069601 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.070605 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.070572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.070707 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.070687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.071101 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071085 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.071493 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071476 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 10:03:58.071591 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071502 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 10:03:58.071591 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071515 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 10:03:58.071591 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071549 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wwkdk\"" Apr 21 10:03:58.071838 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071821 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.071996 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.071977 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.072085 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.072046 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:03:58.072846 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.072821 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:03:58.072956 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.072942 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 10:03:58.073278 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.073262 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tgn75\"" Apr 21 10:03:58.073362 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.073264 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 10:03:58.073362 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.073342 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 10:03:58.073488 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.073471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dbqqd\"" Apr 21 10:03:58.079288 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfcg\" (UniqueName: \"kubernetes.io/projected/398c1473-0683-4af4-866e-a4c6405244ff-kube-api-access-rnfcg\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.079383 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-kubelet\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079383 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-cni-bin\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079383 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1744533a-262f-4150-9f9b-9183b9e8576e-serviceca\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.079383 
ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/398c1473-0683-4af4-866e-a4c6405244ff-tmp-dir\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-slash\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-env-overrides\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-var-lib-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw2qc\" (UniqueName: \"kubernetes.io/projected/e099e319-e542-43c2-9f97-e5b95d49e31d-kube-api-access-fw2qc\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-etc-selinux\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/398c1473-0683-4af4-866e-a4c6405244ff-hosts-file\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-systemd\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysctl-d\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-sys\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-var-lib-kubelet\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-sys-fs\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8k6f\" (UniqueName: \"kubernetes.io/projected/a36edaa6-3955-47d5-afca-2379e2c7cf39-kube-api-access-c8k6f\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-systemd-units\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-run\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxl7\" (UniqueName: \"kubernetes.io/projected/ecd398ca-3264-4609-b862-e4345b84ce0e-kube-api-access-vwxl7\") pod 
\"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1744533a-262f-4150-9f9b-9183b9e8576e-host\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.079923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-node-log\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-cni-netd\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.079990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-ovn\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-socket-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-ovnkube-script-lib\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn7t9\" (UniqueName: \"kubernetes.io/projected/100383eb-b81b-458e-9697-d08a4606d57e-kube-api-access-qn7t9\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5qm7\" (UniqueName: \"kubernetes.io/projected/1744533a-262f-4150-9f9b-9183b9e8576e-kube-api-access-x5qm7\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080193 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-ovnkube-config\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysconfig\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-log-socket\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/100383eb-b81b-458e-9697-d08a4606d57e-ovn-node-metrics-cert\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:03:58.080268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-modprobe-d\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080304 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-kubernetes\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-lib-modules\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9a641cbb-c7b6-4574-b609-764377332512-agent-certs\") pod \"konnectivity-agent-cslsr\" (UID: \"9a641cbb-c7b6-4574-b609-764377332512\") " pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-cnibin\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080448 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-registration-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9a641cbb-c7b6-4574-b609-764377332512-konnectivity-ca\") pod \"konnectivity-agent-cslsr\" (UID: \"9a641cbb-c7b6-4574-b609-764377332512\") " pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-host\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-system-cni-dir\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-device-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecd398ca-3264-4609-b862-e4345b84ce0e-tmp\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-etc-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysctl-conf\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.080912 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-os-release\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.081671 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-run-netns\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.081671 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-systemd\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.081671 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.080817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-tuned\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.114120 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.114092 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:57 +0000 UTC" deadline="2028-01-31 20:22:05.287136978 +0000 UTC" Apr 21 10:03:58.114201 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.114119 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15610h18m7.173021295s" Apr 21 10:03:58.166768 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.166744 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 10:03:58.181035 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-cni-multus\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.181160 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-multus-certs\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.181160 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181078 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-ovn\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181160 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9f88\" (UniqueName: \"kubernetes.io/projected/1aa4f54e-36b5-40c5-8faa-641c649d50e7-kube-api-access-g9f88\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.181324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-ovn\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-hostroot\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.181324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.181324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.181324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.181324 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-socket-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-ovnkube-script-lib\") pod \"ovnkube-node-vswh9\" (UID: 
\"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn7t9\" (UniqueName: \"kubernetes.io/projected/100383eb-b81b-458e-9697-d08a4606d57e-kube-api-access-qn7t9\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5qm7\" (UniqueName: \"kubernetes.io/projected/1744533a-262f-4150-9f9b-9183b9e8576e-kube-api-access-x5qm7\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-cni-bin\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-socket-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-ovnkube-config\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysconfig\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-log-socket\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/100383eb-b81b-458e-9697-d08a4606d57e-ovn-node-metrics-cert\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-modprobe-d\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-kubernetes\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.181616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-lib-modules\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysconfig\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-cni-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-socket-dir-parent\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181721 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9a641cbb-c7b6-4574-b609-764377332512-agent-certs\") pod \"konnectivity-agent-cslsr\" (UID: \"9a641cbb-c7b6-4574-b609-764377332512\") " pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1aa4f54e-36b5-40c5-8faa-641c649d50e7-iptables-alerter-script\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-modprobe-d\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-lib-modules\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-etc-kubernetes\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwf25\" (UniqueName: \"kubernetes.io/projected/2ae89f79-2df1-4414-b256-f90091f5fa3c-kube-api-access-xwf25\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-kubernetes\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: 
\"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-cnibin\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-registration-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.182217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-cnibin\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.181995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-log-socket\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-registration-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182036 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9a641cbb-c7b6-4574-b609-764377332512-konnectivity-ca\") pod \"konnectivity-agent-cslsr\" (UID: \"9a641cbb-c7b6-4574-b609-764377332512\") " pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-host\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-cnibin\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-ovnkube-script-lib\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-k8s-cni-cncf-io\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-host\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-kubelet\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-conf-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-system-cni-dir\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-device-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.183063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-system-cni-dir\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecd398ca-3264-4609-b862-e4345b84ce0e-tmp\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1aa4f54e-36b5-40c5-8faa-641c649d50e7-host-slash\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-device-dir\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 
10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-ovnkube-config\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-etc-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182420 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysctl-conf\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-etc-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-system-cni-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-os-release\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-run-netns\") pod \"ovnkube-node-vswh9\" (UID: 
\"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-systemd\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-run-netns\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-tuned\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.183857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e099e319-e542-43c2-9f97-e5b95d49e31d-os-release\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysctl-conf\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-systemd\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfcg\" (UniqueName: \"kubernetes.io/projected/398c1473-0683-4af4-866e-a4c6405244ff-kube-api-access-rnfcg\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9a641cbb-c7b6-4574-b609-764377332512-konnectivity-ca\") pod \"konnectivity-agent-cslsr\" (UID: \"9a641cbb-c7b6-4574-b609-764377332512\") " pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-kubelet\") pod 
\"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-kubelet\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-cni-bin\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1744533a-262f-4150-9f9b-9183b9e8576e-serviceca\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7l7\" (UniqueName: \"kubernetes.io/projected/3295da7d-67d3-49fe-887c-1205e6a605d5-kube-api-access-bq7l7\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-cni-bin\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/398c1473-0683-4af4-866e-a4c6405244ff-tmp-dir\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-slash\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-env-overrides\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182869 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-os-release\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-daemon-config\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-var-lib-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184497 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.182978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2qc\" (UniqueName: \"kubernetes.io/projected/e099e319-e542-43c2-9f97-e5b95d49e31d-kube-api-access-fw2qc\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-etc-selinux\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/398c1473-0683-4af4-866e-a4c6405244ff-hosts-file\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-systemd\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysctl-d\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-sys\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183145 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-var-lib-kubelet\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1744533a-262f-4150-9f9b-9183b9e8576e-serviceca\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-sys-fs\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8k6f\" (UniqueName: \"kubernetes.io/projected/a36edaa6-3955-47d5-afca-2379e2c7cf39-kube-api-access-c8k6f\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-systemd-units\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-run\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxl7\" (UniqueName: \"kubernetes.io/projected/ecd398ca-3264-4609-b862-e4345b84ce0e-kube-api-access-vwxl7\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-etc-selinux\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1744533a-262f-4150-9f9b-9183b9e8576e-host\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-slash\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.184955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-netns\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-node-log\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-cni-netd\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-var-lib-openvswitch\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/398c1473-0683-4af4-866e-a4c6405244ff-tmp-dir\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3295da7d-67d3-49fe-887c-1205e6a605d5-cni-binary-copy\") pod \"multus-b24s6\" (UID: 
\"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/398c1473-0683-4af4-866e-a4c6405244ff-hosts-file\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a36edaa6-3955-47d5-afca-2379e2c7cf39-sys-fs\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-run-systemd\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-sys\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-var-lib-kubelet\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-sysctl-d\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1744533a-262f-4150-9f9b-9183b9e8576e-host\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-systemd-units\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ecd398ca-3264-4609-b862-e4345b84ce0e-run\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 
10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.183911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/100383eb-b81b-458e-9697-d08a4606d57e-env-overrides\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.184139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-host-cni-netd\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.185461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.184164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e099e319-e542-43c2-9f97-e5b95d49e31d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.186068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.184155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/100383eb-b81b-458e-9697-d08a4606d57e-node-log\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.186068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.185584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ecd398ca-3264-4609-b862-e4345b84ce0e-etc-tuned\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.186068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.186003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/100383eb-b81b-458e-9697-d08a4606d57e-ovn-node-metrics-cert\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.186276 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.186253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecd398ca-3264-4609-b862-e4345b84ce0e-tmp\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.186337 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.186290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9a641cbb-c7b6-4574-b609-764377332512-agent-certs\") pod \"konnectivity-agent-cslsr\" (UID: \"9a641cbb-c7b6-4574-b609-764377332512\") " pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.189791 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.189768 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:58.189887 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.189795 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:58.189887 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.189810 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:58.189991 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.189927 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:03:58.68988775 +0000 UTC m=+3.084477018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:58.192212 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.192167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfcg\" (UniqueName: \"kubernetes.io/projected/398c1473-0683-4af4-866e-a4c6405244ff-kube-api-access-rnfcg\") pod \"node-resolver-c9648\" (UID: \"398c1473-0683-4af4-866e-a4c6405244ff\") " pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.192212 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.192184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8k6f\" (UniqueName: \"kubernetes.io/projected/a36edaa6-3955-47d5-afca-2379e2c7cf39-kube-api-access-c8k6f\") pod \"aws-ebs-csi-driver-node-2n7f8\" (UID: \"a36edaa6-3955-47d5-afca-2379e2c7cf39\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.193294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.193267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxl7\" (UniqueName: \"kubernetes.io/projected/ecd398ca-3264-4609-b862-e4345b84ce0e-kube-api-access-vwxl7\") pod \"tuned-x9n4l\" (UID: \"ecd398ca-3264-4609-b862-e4345b84ce0e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.193915 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.193873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn7t9\" (UniqueName: \"kubernetes.io/projected/100383eb-b81b-458e-9697-d08a4606d57e-kube-api-access-qn7t9\") pod \"ovnkube-node-vswh9\" (UID: \"100383eb-b81b-458e-9697-d08a4606d57e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.194075 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.194054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5qm7\" (UniqueName: \"kubernetes.io/projected/1744533a-262f-4150-9f9b-9183b9e8576e-kube-api-access-x5qm7\") pod \"node-ca-fznzv\" (UID: \"1744533a-262f-4150-9f9b-9183b9e8576e\") " pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.194560 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.194540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fw2qc\" (UniqueName: \"kubernetes.io/projected/e099e319-e542-43c2-9f97-e5b95d49e31d-kube-api-access-fw2qc\") pod \"multus-additional-cni-plugins-wz5k6\" (UID: \"e099e319-e542-43c2-9f97-e5b95d49e31d\") " pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.284761 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-netns\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3295da7d-67d3-49fe-887c-1205e6a605d5-cni-binary-copy\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-cni-multus\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-multus-certs\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-netns\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9f88\" (UniqueName: \"kubernetes.io/projected/1aa4f54e-36b5-40c5-8faa-641c649d50e7-kube-api-access-g9f88\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-hostroot\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-cni-bin\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.284907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284902 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-cni-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-cni-multus\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-socket-dir-parent\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-multus-certs\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1aa4f54e-36b5-40c5-8faa-641c649d50e7-iptables-alerter-script\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-cni-bin\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.284978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-etc-kubernetes\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-hostroot\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwf25\" (UniqueName: \"kubernetes.io/projected/2ae89f79-2df1-4414-b256-f90091f5fa3c-kube-api-access-xwf25\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285132 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-cnibin\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-k8s-cni-cncf-io\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-kubelet\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-cni-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-conf-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1aa4f54e-36b5-40c5-8faa-641c649d50e7-host-slash\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-system-cni-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7l7\" (UniqueName: \"kubernetes.io/projected/3295da7d-67d3-49fe-887c-1205e6a605d5-kube-api-access-bq7l7\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.285283 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-var-lib-kubelet\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-os-release\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.285277 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-daemon-config\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-os-release\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1aa4f54e-36b5-40c5-8faa-641c649d50e7-host-slash\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.285348 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:03:58.785330481 +0000 UTC m=+3.179919733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-cnibin\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-system-cni-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-socket-dir-parent\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3295da7d-67d3-49fe-887c-1205e6a605d5-cni-binary-copy\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1aa4f54e-36b5-40c5-8faa-641c649d50e7-iptables-alerter-script\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-host-run-k8s-cni-cncf-io\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-etc-kubernetes\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-conf-dir\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.286133 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.285841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/3295da7d-67d3-49fe-887c-1205e6a605d5-multus-daemon-config\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.296037 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.296015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9f88\" (UniqueName: \"kubernetes.io/projected/1aa4f54e-36b5-40c5-8faa-641c649d50e7-kube-api-access-g9f88\") pod \"iptables-alerter-8l6xb\" (UID: \"1aa4f54e-36b5-40c5-8faa-641c649d50e7\") " pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.296281 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.296265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwf25\" (UniqueName: \"kubernetes.io/projected/2ae89f79-2df1-4414-b256-f90091f5fa3c-kube-api-access-xwf25\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.324138 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.324114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7l7\" (UniqueName: \"kubernetes.io/projected/3295da7d-67d3-49fe-887c-1205e6a605d5-kube-api-access-bq7l7\") pod \"multus-b24s6\" (UID: \"3295da7d-67d3-49fe-887c-1205e6a605d5\") " pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.371259 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.371230 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:03:58.377979 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.377927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:03:58.384711 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.384691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" Apr 21 10:03:58.391709 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.391689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c9648" Apr 21 10:03:58.397857 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.397834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" Apr 21 10:03:58.405342 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.405325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fznzv" Apr 21 10:03:58.411952 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.411936 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" Apr 21 10:03:58.419546 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.419531 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8l6xb" Apr 21 10:03:58.425111 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.425094 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b24s6" Apr 21 10:03:58.711042 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.711012 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa4f54e_36b5_40c5_8faa_641c649d50e7.slice/crio-76eb6b541ce23f1d2e94f8c251ecbf1bae6dd198351059962993411e1c72a75f WatchSource:0}: Error finding container 76eb6b541ce23f1d2e94f8c251ecbf1bae6dd198351059962993411e1c72a75f: Status 404 returned error can't find the container with id 76eb6b541ce23f1d2e94f8c251ecbf1bae6dd198351059962993411e1c72a75f Apr 21 10:03:58.712257 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.712239 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100383eb_b81b_458e_9697_d08a4606d57e.slice/crio-fe6446388d0a38969f7fd17f01f0c082b913a267bdb2ea6812ea05d419fa74d5 WatchSource:0}: Error finding container fe6446388d0a38969f7fd17f01f0c082b913a267bdb2ea6812ea05d419fa74d5: Status 404 returned error can't find the container with id fe6446388d0a38969f7fd17f01f0c082b913a267bdb2ea6812ea05d419fa74d5 Apr 21 10:03:58.713112 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.713075 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3295da7d_67d3_49fe_887c_1205e6a605d5.slice/crio-f7cf886ea4fc01337baac31a5a996555cb9e06efd65dceb1ed645236d4072f40 WatchSource:0}: Error finding container f7cf886ea4fc01337baac31a5a996555cb9e06efd65dceb1ed645236d4072f40: Status 404 returned error can't find the container with id f7cf886ea4fc01337baac31a5a996555cb9e06efd65dceb1ed645236d4072f40 Apr 21 10:03:58.714841 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.714372 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398c1473_0683_4af4_866e_a4c6405244ff.slice/crio-7c88d61a0a55b90453c756f20a589c6425f3251a6a4264f0d4566122d99aed34 WatchSource:0}: Error finding container 7c88d61a0a55b90453c756f20a589c6425f3251a6a4264f0d4566122d99aed34: Status 404 returned error can't find the container with id 7c88d61a0a55b90453c756f20a589c6425f3251a6a4264f0d4566122d99aed34 Apr 21 10:03:58.716470 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.716416 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode099e319_e542_43c2_9f97_e5b95d49e31d.slice/crio-75f99eb4b251970d7f620a7dc18d2f8e8233b2a9daed987db8959eec4b91a43e WatchSource:0}: Error finding container 75f99eb4b251970d7f620a7dc18d2f8e8233b2a9daed987db8959eec4b91a43e: Status 404 returned error can't find the container with id 75f99eb4b251970d7f620a7dc18d2f8e8233b2a9daed987db8959eec4b91a43e Apr 21 10:03:58.717318 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.717292 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a641cbb_c7b6_4574_b609_764377332512.slice/crio-90306ca23fb734e60f13792b9dcc63bf930dbde6487b4489cb7969075fe1af7b WatchSource:0}: Error finding container 90306ca23fb734e60f13792b9dcc63bf930dbde6487b4489cb7969075fe1af7b: Status 404 returned error can't find the container with id 90306ca23fb734e60f13792b9dcc63bf930dbde6487b4489cb7969075fe1af7b Apr 21 10:03:58.718045 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.717970 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd398ca_3264_4609_b862_e4345b84ce0e.slice/crio-916e555e146b85110092c3ec2160fa1d270008451bbe831e5f48ed4a81846341 WatchSource:0}: Error finding container 916e555e146b85110092c3ec2160fa1d270008451bbe831e5f48ed4a81846341: Status 404 returned error can't find the container with id 916e555e146b85110092c3ec2160fa1d270008451bbe831e5f48ed4a81846341 Apr 21 10:03:58.720652 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.720247 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda36edaa6_3955_47d5_afca_2379e2c7cf39.slice/crio-7f22da9ea72f3932dfefc0725a01f57110d6e471e6579a4dc3462dbfb3b06cab WatchSource:0}: Error finding container 7f22da9ea72f3932dfefc0725a01f57110d6e471e6579a4dc3462dbfb3b06cab: Status 404 returned error can't find the container with id 7f22da9ea72f3932dfefc0725a01f57110d6e471e6579a4dc3462dbfb3b06cab Apr 21 10:03:58.720814 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:03:58.720795 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1744533a_262f_4150_9f9b_9183b9e8576e.slice/crio-b93db820784abf030cfbcb00e1dd797e8f270add5a1ab0431080f928e9aafbd8 WatchSource:0}: Error finding container b93db820784abf030cfbcb00e1dd797e8f270add5a1ab0431080f928e9aafbd8: Status 404 returned error can't find the container with id b93db820784abf030cfbcb00e1dd797e8f270add5a1ab0431080f928e9aafbd8 Apr 21 10:03:58.789150 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.789129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:03:58.789244 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:58.789170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:58.789285 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.789258 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:58.789322 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.789298 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:58.789353 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.789321 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:58.789353 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.789334 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:58.789450 ip-10-0-132-46 
kubenswrapper[2577]: E0421 10:03:58.789309 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:03:59.789292672 +0000 UTC m=+4.183881921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:58.789450 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:58.789439 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:03:59.789420744 +0000 UTC m=+4.184009995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:59.114870 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.114746 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:57 +0000 UTC" deadline="2027-12-29 01:35:47.904751124 +0000 UTC" Apr 21 10:03:59.114870 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.114781 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14799h31m48.789974115s" Apr 21 10:03:59.200103 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.200068 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:59.200271 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.200189 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:03:59.217232 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.217197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cslsr" event={"ID":"9a641cbb-c7b6-4574-b609-764377332512","Type":"ContainerStarted","Data":"90306ca23fb734e60f13792b9dcc63bf930dbde6487b4489cb7969075fe1af7b"} Apr 21 10:03:59.224808 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.224762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerStarted","Data":"75f99eb4b251970d7f620a7dc18d2f8e8233b2a9daed987db8959eec4b91a43e"} Apr 21 10:03:59.235977 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.235944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c9648" event={"ID":"398c1473-0683-4af4-866e-a4c6405244ff","Type":"ContainerStarted","Data":"7c88d61a0a55b90453c756f20a589c6425f3251a6a4264f0d4566122d99aed34"} Apr 21 10:03:59.252378 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.252310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"fe6446388d0a38969f7fd17f01f0c082b913a267bdb2ea6812ea05d419fa74d5"} Apr 21 10:03:59.256440 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.256377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fznzv" event={"ID":"1744533a-262f-4150-9f9b-9183b9e8576e","Type":"ContainerStarted","Data":"b93db820784abf030cfbcb00e1dd797e8f270add5a1ab0431080f928e9aafbd8"} Apr 21 10:03:59.259652 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.259625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" event={"ID":"ecd398ca-3264-4609-b862-e4345b84ce0e","Type":"ContainerStarted","Data":"916e555e146b85110092c3ec2160fa1d270008451bbe831e5f48ed4a81846341"} Apr 21 10:03:59.270086 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.270040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b24s6" event={"ID":"3295da7d-67d3-49fe-887c-1205e6a605d5","Type":"ContainerStarted","Data":"f7cf886ea4fc01337baac31a5a996555cb9e06efd65dceb1ed645236d4072f40"} Apr 21 10:03:59.273342 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.273319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8l6xb" event={"ID":"1aa4f54e-36b5-40c5-8faa-641c649d50e7","Type":"ContainerStarted","Data":"76eb6b541ce23f1d2e94f8c251ecbf1bae6dd198351059962993411e1c72a75f"} Apr 21 10:03:59.284580 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.284556 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" event={"ID":"d7f4dde93369cc99ed3e326eef29a265","Type":"ContainerStarted","Data":"c878047d3fc756594de336ce5db3804928f1bc45c57c3e68d8bab04075c55931"} Apr 21 10:03:59.291169 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.291113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" event={"ID":"a36edaa6-3955-47d5-afca-2379e2c7cf39","Type":"ContainerStarted","Data":"7f22da9ea72f3932dfefc0725a01f57110d6e471e6579a4dc3462dbfb3b06cab"} Apr 21 10:03:59.801541 ip-10-0-132-46 kubenswrapper[2577]: 
I0421 10:03:59.801348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:03:59.801541 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:03:59.801435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:03:59.801541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.801523 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:59.801784 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.801578 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:59.801784 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.801594 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:59.801784 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.801601 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:01.801578944 +0000 UTC m=+6.196168204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:59.801784 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.801607 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:59.801784 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:03:59.801657 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:04:01.801641261 +0000 UTC m=+6.196230512 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:00.202531 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:00.202500 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:00.202996 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:00.202619 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:00.316330 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:00.316294 2577 generic.go:358] "Generic (PLEG): container finished" podID="4b63f7e1ac759f67a8c9ef93a3b1d257" containerID="b9f8c30c1a25808dc8c840cc7903a060dc6c785559f65773e68be476967ccb64" exitCode=0 Apr 21 10:04:00.316524 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:00.316452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" event={"ID":"4b63f7e1ac759f67a8c9ef93a3b1d257","Type":"ContainerDied","Data":"b9f8c30c1a25808dc8c840cc7903a060dc6c785559f65773e68be476967ccb64"} Apr 21 10:04:00.330996 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:00.330947 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-46.ec2.internal" podStartSLOduration=3.330930668 podStartE2EDuration="3.330930668s" podCreationTimestamp="2026-04-21 10:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:59.300087622 +0000 UTC m=+3.694676894" watchObservedRunningTime="2026-04-21 10:04:00.330930668 +0000 UTC m=+4.725519942" Apr 21 10:04:01.199509 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:01.199480 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:01.199677 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.199582 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:01.325449 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:01.325134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" event={"ID":"4b63f7e1ac759f67a8c9ef93a3b1d257","Type":"ContainerStarted","Data":"0a19c8acf8bfeb62a6449a64f207abe5c3acc8cb4b5b68bc1fc620dfa55d6424"} Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:01.817563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:01.817630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.817754 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.817817 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:05.817798043 +0000 UTC m=+10.212387311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.817897 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.817910 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.817922 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:01.818288 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:01.817957 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:04:05.81794571 +0000 UTC m=+10.212534967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:02.208786 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:02.208572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:02.208786 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:02.208757 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:03.200210 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:03.200180 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:03.200584 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:03.200315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:04.201421 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:04.201376 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:04.201766 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:04.201487 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:05.200161 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:05.200119 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:05.200346 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.200275 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:05.853137 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:05.853099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:05.853174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.853265 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.853322 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.853341 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.853350 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.853412 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:13.853372501 +0000 UTC m=+18.247961759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:05.853541 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:05.853435 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:04:13.853424382 +0000 UTC m=+18.248013633 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:06.200176 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:06.200145 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:06.200315 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:06.200229 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:07.200060 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:07.200029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:07.200478 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:07.200148 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:08.199826 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:08.199792 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:08.200000 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:08.199920 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:09.199557 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:09.199527 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:09.200026 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:09.199661 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:10.199489 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:10.199456 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:10.199669 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:10.199564 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:11.199330 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:11.199292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:11.199526 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:11.199442 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:12.199481 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:12.199447 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:12.199926 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:12.199572 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:13.199879 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:13.199842 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:13.200341 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.199992 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:13.905805 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:13.905770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:13.905972 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:13.905835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:13.905972 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.905947 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:13.905972 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.905968 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:13.906091 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.905984 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:13.906091 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.905997 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:13.906091 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.906013 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:29.905992707 +0000 UTC m=+34.300581961 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:13.906091 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:13.906029 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:04:29.906020237 +0000 UTC m=+34.300609486 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:14.199326 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:14.199295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:14.199498 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:14.199429 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:15.199426 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:15.199379 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:15.199822 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:15.199515 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:16.200681 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.200540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:16.200982 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:16.200734 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:16.350147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.350120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" event={"ID":"a36edaa6-3955-47d5-afca-2379e2c7cf39","Type":"ContainerStarted","Data":"a0922ff4c2aff1cdd5944e56a3927f336f31331623dd5f4732b31484487f5665"} Apr 21 10:04:16.351203 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.351180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cslsr" event={"ID":"9a641cbb-c7b6-4574-b609-764377332512","Type":"ContainerStarted","Data":"25b15a7a99b33f34f3456a5840e437213322e81c11131a2980deaf2ac0ebd082"} Apr 21 10:04:16.352424 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.352379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerStarted","Data":"ae8701250352703a5dc95272cf906205ed1bcd058283d325afc485f57cd1f330"} Apr 21 10:04:16.353655 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.353624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c9648" event={"ID":"398c1473-0683-4af4-866e-a4c6405244ff","Type":"ContainerStarted","Data":"3969faa37f31062dcda5d9b2f3a220d887d079a80d4a23a2464a0b7b75bb91f4"} Apr 21 10:04:16.355338 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.355316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"d79045153b6eae6e0707e3d152723b2fb16e1d707b4833c20e0125133fa4fecc"} Apr 21 10:04:16.355450 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.355343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"aa2f17fe8cf1997dca829815db9b240d62736829e5a3e15a44526f0821f95f21"} Apr 21 10:04:16.356652 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.356632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fznzv" event={"ID":"1744533a-262f-4150-9f9b-9183b9e8576e","Type":"ContainerStarted","Data":"52c5585305868d408cf06ad5cec1dbe62e9ae14e95b6c09d8bcfe806650b7505"} Apr 21 10:04:16.357763 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.357744 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" event={"ID":"ecd398ca-3264-4609-b862-e4345b84ce0e","Type":"ContainerStarted","Data":"9a35523bc468f85140827324bf30a5f42c397f3cd1ce4ffbbaf2eeb846da0285"} Apr 21 10:04:16.358897 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.358878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b24s6" event={"ID":"3295da7d-67d3-49fe-887c-1205e6a605d5","Type":"ContainerStarted","Data":"e4b3e2aa3819e1cebf9f0d321099512ba15c9dff621ce55a8c24fe9259754cef"} Apr 21 10:04:16.368063 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.368018 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-46.ec2.internal" podStartSLOduration=19.368004728 podStartE2EDuration="19.368004728s" podCreationTimestamp="2026-04-21 10:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:04:01.338861654 +0000 UTC m=+5.733451044" watchObservedRunningTime="2026-04-21 10:04:16.368004728 +0000 UTC m=+20.762594002" Apr 21 10:04:16.368613 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.368575 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cslsr" podStartSLOduration=3.170955191 podStartE2EDuration="20.368564308s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.719504008 +0000 UTC m=+3.114093261" lastFinishedPulling="2026-04-21 10:04:15.917113115 +0000 UTC m=+20.311702378" observedRunningTime="2026-04-21 10:04:16.368149089 +0000 UTC m=+20.762738353" watchObservedRunningTime="2026-04-21 10:04:16.368564308 +0000 UTC m=+20.763153579" Apr 21 10:04:16.388125 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.388068 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b24s6" podStartSLOduration=3.166896886 podStartE2EDuration="20.388054849s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.71553906 +0000 UTC m=+3.110128323" lastFinishedPulling="2026-04-21 10:04:15.936697022 +0000 UTC m=+20.331286286" observedRunningTime="2026-04-21 10:04:16.387501309 +0000 UTC m=+20.782090583" watchObservedRunningTime="2026-04-21 10:04:16.388054849 +0000 UTC m=+20.782644119" Apr 21 10:04:16.407420 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.407359 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c9648" podStartSLOduration=3.228890484 podStartE2EDuration="20.407344423s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.716095369 +0000 UTC m=+3.110684617" lastFinishedPulling="2026-04-21 10:04:15.8945493 +0000 UTC m=+20.289138556" observedRunningTime="2026-04-21 10:04:16.406489902 +0000 UTC m=+20.801079176" watchObservedRunningTime="2026-04-21 10:04:16.407344423 +0000 UTC m=+20.801933693" Apr 21 10:04:16.435637 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.435483 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x9n4l" podStartSLOduration=3.23700235 podStartE2EDuration="20.435468626s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.720119899 +0000 UTC m=+3.114709149" lastFinishedPulling="2026-04-21 10:04:15.918586171 +0000 UTC m=+20.313175425" observedRunningTime="2026-04-21 10:04:16.435127104 +0000 UTC m=+20.829716374" watchObservedRunningTime="2026-04-21 10:04:16.435468626 +0000 UTC m=+20.830057897" Apr 21 10:04:16.457956 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:16.457869 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fznzv" podStartSLOduration=11.279669252 podStartE2EDuration="20.457849917s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.722507192 +0000 UTC m=+3.117096441" lastFinishedPulling="2026-04-21 10:04:07.900687854 +0000 UTC m=+12.295277106" observedRunningTime="2026-04-21 10:04:16.457610655 +0000 UTC m=+20.852199924" watchObservedRunningTime="2026-04-21 10:04:16.457849917 +0000 UTC m=+20.852439188" Apr 21 10:04:17.199310 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.199275 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:17.199484 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:17.199434 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:17.362128 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.362091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8l6xb" event={"ID":"1aa4f54e-36b5-40c5-8faa-641c649d50e7","Type":"ContainerStarted","Data":"0788063fb035e38917cf419c67dc3a8f94623b74af1fbe57ab7fac2e3387a140"} Apr 21 10:04:17.363404 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.363369 2577 generic.go:358] "Generic (PLEG): container finished" podID="e099e319-e542-43c2-9f97-e5b95d49e31d" containerID="ae8701250352703a5dc95272cf906205ed1bcd058283d325afc485f57cd1f330" exitCode=0 Apr 21 10:04:17.363480 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.363456 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerDied","Data":"ae8701250352703a5dc95272cf906205ed1bcd058283d325afc485f57cd1f330"} Apr 21 10:04:17.365615 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.365597 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:04:17.365873 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.365853 2577 generic.go:358] "Generic (PLEG): container finished" podID="100383eb-b81b-458e-9697-d08a4606d57e" containerID="d79045153b6eae6e0707e3d152723b2fb16e1d707b4833c20e0125133fa4fecc" exitCode=1 Apr 21 10:04:17.365956 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.365937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerDied","Data":"d79045153b6eae6e0707e3d152723b2fb16e1d707b4833c20e0125133fa4fecc"} Apr 21 10:04:17.366006 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.365969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"25995197a9e9a50d9e903b81ea45083a0a270fac02f79781b5cb9c8c49fb2bcd"} Apr 21 10:04:17.366006 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.365984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"d8d7a9e7baebbc1afe2c6442d752efabd2a9d27e3a6794d8bb3f8edd38dd6501"} Apr 21 10:04:17.366006 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.365996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"54ad9cdcc14dcc4bc4354194577200a5746566ca428077afcd23b81d00c48774"} Apr 21 10:04:17.366090 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.366007 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"17803db11ef48a9f64eda23962563aab6b6590334cd7fbd6b1fb1b330d88bc0b"} Apr 21 10:04:17.380349 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.380315 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8l6xb" podStartSLOduration=4.176110006 podStartE2EDuration="21.380304429s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.712949064 +0000 UTC m=+3.107538316" lastFinishedPulling="2026-04-21 10:04:15.917143483 +0000 UTC m=+20.311732739" observedRunningTime="2026-04-21 10:04:17.380061655 +0000 UTC m=+21.774650925" watchObservedRunningTime="2026-04-21 10:04:17.380304429 +0000 UTC m=+21.774893703" Apr 21 10:04:17.539612 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:17.539584 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 10:04:18.144226 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.144031 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:17.539608305Z","UUID":"9b63c1a9-ec17-4433-9fa3-a653b42d8421","Handler":null,"Name":"","Endpoint":""} Apr 21 10:04:18.147535 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.147511 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 10:04:18.147667 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.147545 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 10:04:18.200177 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.200150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:18.200342 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:18.200282 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:18.369282 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.369242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" event={"ID":"a36edaa6-3955-47d5-afca-2379e2c7cf39","Type":"ContainerStarted","Data":"fe869d8c74f55af4cb392d0e05e5b214bf7d916504666e1e9e5ecd69f26c2144"} Apr 21 10:04:18.450868 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.450837 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:04:18.451506 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:18.451488 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:04:19.199862 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:19.199625 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:19.200037 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:19.199983 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:19.373801 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:19.373774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:04:19.374298 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:19.374270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"1607ec0f212f0a64dceea4cd62c6983ba713da8904bd3efa2bee1995bdeaf915"} Apr 21 10:04:19.376320 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:19.376280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" event={"ID":"a36edaa6-3955-47d5-afca-2379e2c7cf39","Type":"ContainerStarted","Data":"c6b3aa98416b2d7ff52dadcdd0f58fa5dd1a48b65ddf665be4479535d9238332"} Apr 21 10:04:19.407626 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:19.407580 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2n7f8" podStartSLOduration=3.359264784 podStartE2EDuration="23.407569379s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.721419255 +0000 UTC m=+3.116008511" lastFinishedPulling="2026-04-21 10:04:18.769723855 +0000 UTC m=+23.164313106" observedRunningTime="2026-04-21 10:04:19.407265005 +0000 UTC m=+23.801854289" watchObservedRunningTime="2026-04-21 10:04:19.407569379 +0000 UTC m=+23.802158650" Apr 21 10:04:20.200125 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:20.200090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:20.200313 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:20.200232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:21.199230 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.199208 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:21.199726 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:21.199326 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:21.382969 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.382751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerStarted","Data":"c26f4fbabf654b02d11f231d1aa82cf5c4179ca56c880a70207bbe72891217ba"} Apr 21 10:04:21.385973 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.385951 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:04:21.386275 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.386243 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"22c77943c9af30fb48cfd39859ce67881bcbdd90bad77e2532f03368b243c3f9"} Apr 21 10:04:21.386543 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.386525 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:04:21.386594 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.386554 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:04:21.386677 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.386663 2577 scope.go:117] "RemoveContainer" containerID="d79045153b6eae6e0707e3d152723b2fb16e1d707b4833c20e0125133fa4fecc" Apr 21 10:04:21.401786 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:21.401765 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:04:22.200081 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.200046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:22.200381 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:22.200186 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:22.395382 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.395339 2577 generic.go:358] "Generic (PLEG): container finished" podID="e099e319-e542-43c2-9f97-e5b95d49e31d" containerID="c26f4fbabf654b02d11f231d1aa82cf5c4179ca56c880a70207bbe72891217ba" exitCode=0 Apr 21 10:04:22.395382 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.395374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerDied","Data":"c26f4fbabf654b02d11f231d1aa82cf5c4179ca56c880a70207bbe72891217ba"} Apr 21 10:04:22.399068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.399049 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:04:22.399457 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.399430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" event={"ID":"100383eb-b81b-458e-9697-d08a4606d57e","Type":"ContainerStarted","Data":"1c87495964c472bb6f41a56b74473d6aa64d44caf7f12dcb3a17ee191ed05382"} Apr 21 10:04:22.400600 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.399867 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:04:22.421018 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.420982 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:04:22.469855 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:22.468840 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" podStartSLOduration=9.205448737 podStartE2EDuration="26.468823289s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.714694744 +0000 UTC m=+3.109284008" lastFinishedPulling="2026-04-21 10:04:15.978069293 +0000 UTC m=+20.372658560" observedRunningTime="2026-04-21 10:04:22.466418463 +0000 UTC m=+26.861007736" watchObservedRunningTime="2026-04-21 10:04:22.468823289 +0000 UTC m=+26.863412552" Apr 21 10:04:23.156448 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:23.156419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mwlr4"] Apr 21 10:04:23.156581 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:23.156523 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:23.156649 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:23.156610 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:23.161985 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:23.161960 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7czdf"] Apr 21 10:04:23.162096 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:23.162086 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:23.162202 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:23.162182 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:23.402802 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:23.402758 2577 generic.go:358] "Generic (PLEG): container finished" podID="e099e319-e542-43c2-9f97-e5b95d49e31d" containerID="8523808fafe2d1241255e37d34a0e9c34830d06992836ffee6f925aab6dc5666" exitCode=0 Apr 21 10:04:23.403177 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:23.402850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerDied","Data":"8523808fafe2d1241255e37d34a0e9c34830d06992836ffee6f925aab6dc5666"} Apr 21 10:04:24.353507 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:24.353249 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:04:24.353629 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:24.353541 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 10:04:24.353864 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:24.353844 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cslsr" Apr 21 10:04:24.406972 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:24.406945 2577 generic.go:358] "Generic (PLEG): container finished" podID="e099e319-e542-43c2-9f97-e5b95d49e31d" containerID="5f1980f5d2cc1de7ae08e4418c869737c19f6fa8b4e3ca16b2c07bc03dff552b" exitCode=0 Apr 21 10:04:24.407345 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:24.407026 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerDied","Data":"5f1980f5d2cc1de7ae08e4418c869737c19f6fa8b4e3ca16b2c07bc03dff552b"} Apr 21 10:04:25.200142 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:25.200113 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:25.200255 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:25.200112 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:25.200305 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:25.200252 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:25.200349 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:25.200301 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:27.199269 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:27.199234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:27.199879 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:27.199234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:27.199879 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:27.199350 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:27.199879 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:27.199467 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:29.199460 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.199427 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:29.199984 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.199553 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:04:29.199984 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.199615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:29.199984 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.199722 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mwlr4" podUID="fe7a5351-0fba-4368-9b98-1791bc7cfdfc" Apr 21 10:04:29.892029 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.891998 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-46.ec2.internal" event="NodeReady" Apr 21 10:04:29.892197 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.892149 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 10:04:29.924112 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.924083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:29.924277 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.924140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:29.924277 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.924252 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:29.924277 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.924265 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:29.924458 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.924280 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:29.924458 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.924293 2577 projected.go:194] Error preparing data for projected volume kube-api-access-tqg5l for pod openshift-network-diagnostics/network-check-target-mwlr4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:29.924458 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.924313 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:05:01.924294623 +0000 UTC m=+66.318883886 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:29.924458 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:29.924329 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l podName:fe7a5351-0fba-4368-9b98-1791bc7cfdfc nodeName:}" failed. No retries permitted until 2026-04-21 10:05:01.92431874 +0000 UTC m=+66.318907990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tqg5l" (UniqueName: "kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l") pod "network-check-target-mwlr4" (UID: "fe7a5351-0fba-4368-9b98-1791bc7cfdfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:29.940751 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.940721 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g62bp"] Apr 21 10:04:29.974293 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.974246 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p5cg2"] Apr 21 10:04:29.974470 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.974432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:29.977315 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.977291 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 10:04:29.978095 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.977531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-g7j6z\"" Apr 21 10:04:29.978095 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.977728 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 10:04:29.989612 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.989589 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g62bp"] Apr 21 10:04:29.989612 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.989609 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p5cg2"] Apr 21 10:04:29.989728 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.989690 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:29.992378 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.992358 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 10:04:29.992484 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.992413 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9st8k\"" Apr 21 10:04:29.992546 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.992530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 10:04:29.993326 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:29.993153 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 10:04:30.125328 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.125297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqqx2\" (UniqueName: \"kubernetes.io/projected/45c00ff2-e16b-4854-9279-a0a6d25f59c8-kube-api-access-rqqx2\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.125499 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.125347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.125499 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.125366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/45c00ff2-e16b-4854-9279-a0a6d25f59c8-tmp-dir\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.125499 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.125409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbbv\" (UniqueName: \"kubernetes.io/projected/f34ed386-407f-400b-a309-9c15bf12db74-kube-api-access-4wbbv\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:30.125499 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.125437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c00ff2-e16b-4854-9279-a0a6d25f59c8-config-volume\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.125652 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.125506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:30.226561 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:30.226561 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqqx2\" (UniqueName: \"kubernetes.io/projected/45c00ff2-e16b-4854-9279-a0a6d25f59c8-kube-api-access-rqqx2\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/45c00ff2-e16b-4854-9279-a0a6d25f59c8-tmp-dir\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbbv\" (UniqueName: \"kubernetes.io/projected/f34ed386-407f-400b-a309-9c15bf12db74-kube-api-access-4wbbv\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c00ff2-e16b-4854-9279-a0a6d25f59c8-config-volume\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.226642 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.226703 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:30.726685388 +0000 UTC m=+35.121274636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.226720 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.226789 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:30.72677331 +0000 UTC m=+35.121362566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:04:30.227092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.226924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/45c00ff2-e16b-4854-9279-a0a6d25f59c8-tmp-dir\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.227368 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.227152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45c00ff2-e16b-4854-9279-a0a6d25f59c8-config-volume\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.244725 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.244707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqqx2\" (UniqueName: \"kubernetes.io/projected/45c00ff2-e16b-4854-9279-a0a6d25f59c8-kube-api-access-rqqx2\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.244816 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.244780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbbv\" (UniqueName: \"kubernetes.io/projected/f34ed386-407f-400b-a309-9c15bf12db74-kube-api-access-4wbbv\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:30.729905 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.729720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:30.730000 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:30.729942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:30.730000 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.729859 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:30.730067 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.730024 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:31.730008191 +0000 UTC m=+36.124597440 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:04:30.730067 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.730051 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:30.730133 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:30.730102 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:31.730088343 +0000 UTC m=+36.124677592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:04:31.199981 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.199946 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:04:31.200175 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.199949 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:04:31.203865 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.203842 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:31.203990 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.203878 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:31.203990 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.203845 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:31.203990 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.203845 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bjzzq\"" Apr 21 10:04:31.203990 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.203851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7zbjr\"" Apr 21 10:04:31.421888 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.421856 2577 generic.go:358] "Generic (PLEG): container finished" podID="e099e319-e542-43c2-9f97-e5b95d49e31d" containerID="414df351fe4ea96a0a7f90c43240e85de601958114da901a54599709ac1a2dd2" exitCode=0 Apr 21 10:04:31.422294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.421894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerDied","Data":"414df351fe4ea96a0a7f90c43240e85de601958114da901a54599709ac1a2dd2"} Apr 21 10:04:31.737671 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.737581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: 
\"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:31.737671 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:31.737630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:31.737817 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:31.737723 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:31.737817 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:31.737730 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:31.737817 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:31.737792 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:33.737774467 +0000 UTC m=+38.132363718 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:04:31.737817 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:31.737808 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:33.737800916 +0000 UTC m=+38.132390165 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:04:32.426007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:32.425964 2577 generic.go:358] "Generic (PLEG): container finished" podID="e099e319-e542-43c2-9f97-e5b95d49e31d" containerID="a6f5e3ebfc0da3c159b69bd34fbd56158688a74dd0d21cb1d8a36577f3806c3a" exitCode=0 Apr 21 10:04:32.426343 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:32.426027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerDied","Data":"a6f5e3ebfc0da3c159b69bd34fbd56158688a74dd0d21cb1d8a36577f3806c3a"} Apr 21 10:04:33.431235 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:33.431201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" event={"ID":"e099e319-e542-43c2-9f97-e5b95d49e31d","Type":"ContainerStarted","Data":"6a38f6e1909209aa6ad5192f0949a79b8e554b319c72d446e328cc02c84ee8e7"} Apr 21 10:04:33.461865 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:33.461823 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wz5k6" podStartSLOduration=5.758415047 podStartE2EDuration="37.461810436s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:03:58.718362483 +0000 UTC m=+3.112951733" lastFinishedPulling="2026-04-21 10:04:30.421757874 +0000 UTC m=+34.816347122" observedRunningTime="2026-04-21 10:04:33.460417393 +0000 UTC m=+37.855006663" watchObservedRunningTime="2026-04-21 10:04:33.461810436 +0000 UTC m=+37.856399706" Apr 21 10:04:33.752611 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:33.752535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:33.752770 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:33.752611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:33.752770 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:33.752700 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:33.752770 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:33.752726 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:33.752770 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:33.752767 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:37.75274935 +0000 UTC m=+42.147338619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:04:33.752923 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:33.752784 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:37.752777546 +0000 UTC m=+42.147366794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:04:37.775844 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:37.775797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:37.775844 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:37.775860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:37.776332 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:37.775938 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:37.776332 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:37.775952 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:37.776332 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:37.776004 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:45.775988026 +0000 UTC m=+50.170577276 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:04:37.776332 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:37.776018 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:45.776011652 +0000 UTC m=+50.170600901 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:04:45.824936 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:45.824896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:04:45.825555 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:45.824969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:04:45.825555 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:45.825057 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:45.825555 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:45.825086 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:45.825555 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:45.825126 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:01.825107478 +0000 UTC m=+66.219696730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:04:45.825555 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:04:45.825149 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:01.825133882 +0000 UTC m=+66.219723132 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:04:54.418496 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:04:54.418464 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vswh9" Apr 21 10:05:01.832046 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.832006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:05:01.832496 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.832069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:05:01.832496 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:01.832151 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:01.832496 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:01.832152 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:05:01.832496 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:01.832204 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:33.832191015 +0000 UTC m=+98.226780265 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:05:01.832496 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:01.832226 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:33.832218727 +0000 UTC m=+98.226807976 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:05:01.932651 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.932614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:05:01.932809 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.932703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:05:01.935499 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.935478 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:05:01.935564 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.935504 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:05:01.943789 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:01.943772 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:05:01.943854 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:01.943822 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:06:05.943807727 +0000 UTC m=+130.338396976 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : secret "metrics-daemon-secret" not found Apr 21 10:05:01.946176 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.946156 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:05:01.957609 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:01.957585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqg5l\" (UniqueName: \"kubernetes.io/projected/fe7a5351-0fba-4368-9b98-1791bc7cfdfc-kube-api-access-tqg5l\") pod \"network-check-target-mwlr4\" (UID: \"fe7a5351-0fba-4368-9b98-1791bc7cfdfc\") " pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:05:02.112983 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:02.112921 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bjzzq\"" Apr 21 10:05:02.121042 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:02.121023 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:05:02.247177 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:02.247152 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mwlr4"] Apr 21 10:05:02.250666 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:05:02.250633 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe7a5351_0fba_4368_9b98_1791bc7cfdfc.slice/crio-06e96148507546bea2ce90772b77505f958aba91651447041a46f8dfd9be9ef7 WatchSource:0}: Error finding container 06e96148507546bea2ce90772b77505f958aba91651447041a46f8dfd9be9ef7: Status 404 returned error can't find the container with id 06e96148507546bea2ce90772b77505f958aba91651447041a46f8dfd9be9ef7 Apr 21 10:05:02.483340 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:02.483310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mwlr4" event={"ID":"fe7a5351-0fba-4368-9b98-1791bc7cfdfc","Type":"ContainerStarted","Data":"06e96148507546bea2ce90772b77505f958aba91651447041a46f8dfd9be9ef7"} Apr 21 10:05:05.489673 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:05.489639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mwlr4" event={"ID":"fe7a5351-0fba-4368-9b98-1791bc7cfdfc","Type":"ContainerStarted","Data":"cb5d9f844a7a9ee7222abbdc483e93b2e6198862616f26a170459fe8975db31b"} Apr 21 10:05:05.490051 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:05.489769 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:05:05.515616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:05.515573 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mwlr4" podStartSLOduration=66.943755467 podStartE2EDuration="1m9.515559412s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:05:02.252928464 +0000 UTC m=+66.647517714" lastFinishedPulling="2026-04-21 10:05:04.824732395 +0000 UTC m=+69.219321659" observedRunningTime="2026-04-21 10:05:05.513562066 +0000 UTC m=+69.908151337" watchObservedRunningTime="2026-04-21 10:05:05.515559412 +0000 UTC m=+69.910148729" Apr 21 10:05:33.853737 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:33.853701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:05:33.854198 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:33.853750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:05:33.854198 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:33.853836 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:33.854198 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:33.853856 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 21 10:05:33.854198 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:33.853891 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert podName:f34ed386-407f-400b-a309-9c15bf12db74 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:37.853875954 +0000 UTC m=+162.248465202 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert") pod "ingress-canary-p5cg2" (UID: "f34ed386-407f-400b-a309-9c15bf12db74") : secret "canary-serving-cert" not found Apr 21 10:05:33.854198 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:05:33.853934 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls podName:45c00ff2-e16b-4854-9279-a0a6d25f59c8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:37.853915402 +0000 UTC m=+162.248504651 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls") pod "dns-default-g62bp" (UID: "45c00ff2-e16b-4854-9279-a0a6d25f59c8") : secret "dns-default-metrics-tls" not found Apr 21 10:05:36.494627 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:05:36.494595 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mwlr4" Apr 21 10:06:05.970989 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:05.970938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:06:05.971515 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:05.971087 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:06:05.971515 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:05.971165 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs podName:2ae89f79-2df1-4414-b256-f90091f5fa3c nodeName:}" failed. No retries permitted until 2026-04-21 10:08:07.97114782 +0000 UTC m=+252.365737068 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs") pod "network-metrics-daemon-7czdf" (UID: "2ae89f79-2df1-4414-b256-f90091f5fa3c") : secret "metrics-daemon-secret" not found Apr 21 10:06:12.959972 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:12.959935 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn"] Apr 21 10:06:12.962010 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:12.961990 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" Apr 21 10:06:12.964608 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:12.964590 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-lkw4f\"" Apr 21 10:06:12.964685 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:12.964646 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:12.965646 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:12.965626 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:12.971922 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:12.971905 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn"] Apr 21 10:06:13.018052 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.018023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7xj\" (UniqueName: \"kubernetes.io/projected/a30b7cfa-6025-45f6-bc51-c813d60a38ae-kube-api-access-gn7xj\") pod \"volume-data-source-validator-7c6cbb6c87-9fjwn\" (UID: \"a30b7cfa-6025-45f6-bc51-c813d60a38ae\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" Apr 21 10:06:13.065725 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.065692 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k"] Apr 21 10:06:13.067760 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.067743 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm"] Apr 21 10:06:13.067915 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.067899 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.069492 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.069465 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.070618 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.070600 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-khjfc"] Apr 21 10:06:13.072559 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.072544 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-77wnm"] Apr 21 10:06:13.072669 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.072657 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.073267 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.073355 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073304 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 10:06:13.073441 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073352 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.073441 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073279 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.073441 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073304 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-42fl6\"" Apr 21 10:06:13.073441 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073417 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 10:06:13.073441 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073267 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zbg97\"" Apr 21 10:06:13.073659 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073267 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.073659 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.073309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 10:06:13.074238 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.074221 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b6946f86d-6v5gp"] Apr 21 10:06:13.074371 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.074355 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.075494 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.075469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 10:06:13.075924 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.075907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 10:06:13.076017 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.076001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.076136 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.076121 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.076203 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.076150 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.077234 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.077215 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pf76z\"" Apr 21 10:06:13.077691 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.077674 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 10:06:13.077928 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.077909 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.079517 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.079493 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-s529l\"" Apr 21 10:06:13.079666 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.079608 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 10:06:13.079756 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.079679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 10:06:13.079890 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.079878 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nljzt\"" Apr 21 10:06:13.080461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.080437 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 10:06:13.080616 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.080598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 10:06:13.081833 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.081683 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.090249 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.090194 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k"] Apr 21 10:06:13.091233 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.091216 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 10:06:13.091635 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.091619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm"] Apr 21 10:06:13.092567 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.092545 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-khjfc"] Apr 21 10:06:13.093356 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.093335 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 10:06:13.093980 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.093959 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b6946f86d-6v5gp"] Apr 21 10:06:13.094612 
ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.094593 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 10:06:13.108471 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.108452 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-77wnm"] Apr 21 10:06:13.118777 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.118754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7xj\" (UniqueName: \"kubernetes.io/projected/a30b7cfa-6025-45f6-bc51-c813d60a38ae-kube-api-access-gn7xj\") pod \"volume-data-source-validator-7c6cbb6c87-9fjwn\" (UID: \"a30b7cfa-6025-45f6-bc51-c813d60a38ae\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" Apr 21 10:06:13.127900 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.127875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7xj\" (UniqueName: \"kubernetes.io/projected/a30b7cfa-6025-45f6-bc51-c813d60a38ae-kube-api-access-gn7xj\") pod \"volume-data-source-validator-7c6cbb6c87-9fjwn\" (UID: \"a30b7cfa-6025-45f6-bc51-c813d60a38ae\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" Apr 21 10:06:13.165820 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.165793 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz"] Apr 21 10:06:13.167780 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.167765 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t"] Apr 21 10:06:13.167926 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.167912 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.169761 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.169745 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.170203 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.170187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 10:06:13.170544 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.170520 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5f8vl\"" Apr 21 10:06:13.170801 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.170783 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-747657c5d4-nfxrn"] Apr 21 10:06:13.171810 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.171791 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 10:06:13.172644 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.172626 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.173078 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.173053 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.173078 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.173061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-smjwv\"" Apr 21 10:06:13.173284 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.173096 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 10:06:13.173284 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.173113 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.173284 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.173128 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 10:06:13.175306 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.175247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-v29ck\"" Apr 21 10:06:13.175565 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.175546 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 10:06:13.175837 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.175820 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 10:06:13.176020 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.175992 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 10:06:13.176096 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.175906 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 10:06:13.176769 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.176750 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 10:06:13.177168 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.177150 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 10:06:13.181933 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.181909 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz"] Apr 21 10:06:13.183077 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.183057 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t"] Apr 21 10:06:13.185845 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.185825 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-747657c5d4-nfxrn"] Apr 21 10:06:13.220066 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220014 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/928066b1-04d2-4c0f-9561-862618e07065-ca-trust-extracted\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220066 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-trusted-ca\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.220197 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-serving-cert\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.220197 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-snapshots\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.220197 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d459929e-9c15-4174-a041-b14f3e183024-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.220298 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-registry-certificates\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220298 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqw8w\" (UniqueName: \"kubernetes.io/projected/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-kube-api-access-bqw8w\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.220298 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-bound-sa-token\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220298 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:06:13.220285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-serving-cert\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.220445 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhtwh\" (UniqueName: \"kubernetes.io/projected/81c76da7-40d4-4b08-985f-83b86ad934f2-kube-api-access-xhtwh\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.220445 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.220445 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-trusted-ca\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220542 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s4kh\" (UniqueName: \"kubernetes.io/projected/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-kube-api-access-7s4kh\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.220542 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78zmf\" (UniqueName: \"kubernetes.io/projected/d459929e-9c15-4174-a041-b14f3e183024-kube-api-access-78zmf\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.220542 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-tmp\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.220542 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-installation-pull-secrets\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: 
\"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220542 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.220682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220543 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-service-ca-bundle\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.220682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-config\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.220682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.220682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-image-registry-private-configuration\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.220682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.220633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnkh\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-kube-api-access-2tnkh\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.270421 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.270383 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" Apr 21 10:06:13.321332 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-serving-cert\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.321475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-default-certificate\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.321475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhtwh\" (UniqueName: \"kubernetes.io/projected/81c76da7-40d4-4b08-985f-83b86ad934f2-kube-api-access-xhtwh\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.321475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.321475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-trusted-ca\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7s4kh\" (UniqueName: \"kubernetes.io/projected/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-kube-api-access-7s4kh\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78zmf\" (UniqueName: \"kubernetes.io/projected/d459929e-9c15-4174-a041-b14f3e183024-kube-api-access-78zmf\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-tmp\") pod \"insights-operator-585dfdc468-khjfc\" (UID: 
\"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-installation-pull-secrets\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.321729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-stats-auth\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.322080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-service-ca-bundle\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.322080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-config\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.322080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.322080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: 
\"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.322080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.321849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.322375 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.322255 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:13.322472 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-tmp\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.322522 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.322522 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.322490 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls podName:81c76da7-40d4-4b08-985f-83b86ad934f2 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:13.822467274 +0000 UTC m=+138.217056527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8vhqm" (UID: "81c76da7-40d4-4b08-985f-83b86ad934f2") : secret "samples-operator-tls" not found Apr 21 10:06:13.322639 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-image-registry-private-configuration\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.322639 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnkh\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-kube-api-access-2tnkh\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.322639 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/09b7d935-6954-4915-8391-de3719c71560-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.322782 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.322660 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:13.322782 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.322676 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b6946f86d-6v5gp: secret "image-registry-tls" not found Apr 21 10:06:13.322782 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/928066b1-04d2-4c0f-9561-862618e07065-ca-trust-extracted\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.322782 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.322782 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.322738 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls podName:928066b1-04d2-4c0f-9561-862618e07065 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:13.82271847 +0000 UTC m=+138.217307735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls") pod "image-registry-7b6946f86d-6v5gp" (UID: "928066b1-04d2-4c0f-9561-862618e07065") : secret "image-registry-tls" not found Apr 21 10:06:13.322782 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.322575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-service-ca-bundle\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.323228 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-config\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.323334 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-trusted-ca\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.323334 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/928066b1-04d2-4c0f-9561-862618e07065-ca-trust-extracted\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.323334 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.323512 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszdc\" (UniqueName: \"kubernetes.io/projected/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-kube-api-access-lszdc\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.323512 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-serving-cert\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.323512 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.323447 2577 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:13.323651 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.323522 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls podName:d459929e-9c15-4174-a041-b14f3e183024 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:13.823503582 +0000 UTC m=+138.218092850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzh5k" (UID: "d459929e-9c15-4174-a041-b14f3e183024") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:13.323651 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-snapshots\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.323651 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgmpk\" (UniqueName: \"kubernetes.io/projected/1181d1e9-066e-44e5-bc62-d300a81ad7a8-kube-api-access-qgmpk\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.323651 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d459929e-9c15-4174-a041-b14f3e183024-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.323862 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-registry-certificates\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.323862 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqw8w\" (UniqueName: \"kubernetes.io/projected/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-kube-api-access-bqw8w\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.323862 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.323734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-bound-sa-token\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.323862 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:06:13.323846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-snapshots\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.324697 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.324472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-trusted-ca\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.324697 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.324652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-trusted-ca\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.325123 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.325097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-registry-certificates\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.325317 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.325291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d459929e-9c15-4174-a041-b14f3e183024-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.325489 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.325465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-installation-pull-secrets\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.326209 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.326167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-serving-cert\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.327636 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.327596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-image-registry-private-configuration\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.327729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.327672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.328122 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.328090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-serving-cert\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.334003 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.333954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78zmf\" (UniqueName: \"kubernetes.io/projected/d459929e-9c15-4174-a041-b14f3e183024-kube-api-access-78zmf\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.335942 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.335714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-bound-sa-token\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.337308 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.337261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnkh\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-kube-api-access-2tnkh\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.337464 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.337441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhtwh\" (UniqueName: \"kubernetes.io/projected/81c76da7-40d4-4b08-985f-83b86ad934f2-kube-api-access-xhtwh\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.337550 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.337477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s4kh\" (UniqueName: \"kubernetes.io/projected/24013a62-41fe-4530-aa6f-3ebb1c0b54cc-kube-api-access-7s4kh\") pod \"console-operator-9d4b6777b-77wnm\" (UID: \"24013a62-41fe-4530-aa6f-3ebb1c0b54cc\") " pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.338349 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.338326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqw8w\" (UniqueName: \"kubernetes.io/projected/6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0-kube-api-access-bqw8w\") pod \"insights-operator-585dfdc468-khjfc\" (UID: \"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0\") " pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.386453 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.386424 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn"] Apr 21 
10:06:13.389707 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:13.389680 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30b7cfa_6025_45f6_bc51_c813d60a38ae.slice/crio-7222999e565f69839117c895e5befb0712edff8fc724f8308008c691691322b5 WatchSource:0}: Error finding container 7222999e565f69839117c895e5befb0712edff8fc724f8308008c691691322b5: Status 404 returned error can't find the container with id 7222999e565f69839117c895e5befb0712edff8fc724f8308008c691691322b5 Apr 21 10:06:13.396311 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.396292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-khjfc" Apr 21 10:06:13.401863 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.401838 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:13.424784 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.424752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.424955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.424790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.424955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.424821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lszdc\" (UniqueName: \"kubernetes.io/projected/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-kube-api-access-lszdc\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.424955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.424876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgmpk\" (UniqueName: \"kubernetes.io/projected/1181d1e9-066e-44e5-bc62-d300a81ad7a8-kube-api-access-qgmpk\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.424955 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.424922 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:13.924899979 +0000 UTC m=+138.319489230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:13.425175 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.424964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-default-certificate\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.425175 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.425011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.425175 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.425043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-stats-auth\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.425175 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.425072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.425175 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.425139 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:13.425175 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.425178 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:13.925164488 +0000 UTC m=+138.319753737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : secret "router-metrics-certs-default" not found Apr 21 10:06:13.425529 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.425379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.425529 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.425453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/09b7d935-6954-4915-8391-de3719c71560-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.425529 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.425469 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:06:13.425529 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.425513 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert podName:09b7d935-6954-4915-8391-de3719c71560 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:13.925498313 +0000 UTC m=+138.320087562 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2kfcz" (UID: "09b7d935-6954-4915-8391-de3719c71560") : secret "networking-console-plugin-cert" not found Apr 21 10:06:13.425733 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.425557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.426578 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.426552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/09b7d935-6954-4915-8391-de3719c71560-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.427595 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.427576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.427767 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.427748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-stats-auth\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.428548 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.428527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-default-certificate\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.435661 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.435641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszdc\" (UniqueName: \"kubernetes.io/projected/b613a503-a4ac-455e-80bf-2ffd14fe2b3d-kube-api-access-lszdc\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7d6t\" (UID: \"b613a503-a4ac-455e-80bf-2ffd14fe2b3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.435831 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.435809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgmpk\" (UniqueName: \"kubernetes.io/projected/1181d1e9-066e-44e5-bc62-d300a81ad7a8-kube-api-access-qgmpk\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.487982 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.487956 
2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" Apr 21 10:06:13.520745 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.520719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-khjfc"] Apr 21 10:06:13.523971 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:13.523945 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2829f4_6e5d_4759_a0ac_3e5e0085f9d0.slice/crio-fbeb70cd63aea59813796bd7de0c612ab0f44f1adda031fcb4f784b59271af8c WatchSource:0}: Error finding container fbeb70cd63aea59813796bd7de0c612ab0f44f1adda031fcb4f784b59271af8c: Status 404 returned error can't find the container with id fbeb70cd63aea59813796bd7de0c612ab0f44f1adda031fcb4f784b59271af8c Apr 21 10:06:13.537265 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.537237 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-77wnm"] Apr 21 10:06:13.541521 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:13.541487 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24013a62_41fe_4530_aa6f_3ebb1c0b54cc.slice/crio-56053786c31b51232b999c14b340b93ac3d994ebeec9801ca346600b6a511ace WatchSource:0}: Error finding container 56053786c31b51232b999c14b340b93ac3d994ebeec9801ca346600b6a511ace: Status 404 returned error can't find the container with id 56053786c31b51232b999c14b340b93ac3d994ebeec9801ca346600b6a511ace Apr 21 10:06:13.607124 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.607092 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t"] Apr 21 10:06:13.609677 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:13.609652 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb613a503_a4ac_455e_80bf_2ffd14fe2b3d.slice/crio-4afe759cccee76268c9b056ccc53d8e7967a9aa88a1f3941c0bc93d9cd0da659 WatchSource:0}: Error finding container 4afe759cccee76268c9b056ccc53d8e7967a9aa88a1f3941c0bc93d9cd0da659: Status 404 returned error can't find the container with id 4afe759cccee76268c9b056ccc53d8e7967a9aa88a1f3941c0bc93d9cd0da659 Apr 21 10:06:13.614895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.614873 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" event={"ID":"b613a503-a4ac-455e-80bf-2ffd14fe2b3d","Type":"ContainerStarted","Data":"4afe759cccee76268c9b056ccc53d8e7967a9aa88a1f3941c0bc93d9cd0da659"} Apr 21 10:06:13.615782 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.615762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" event={"ID":"a30b7cfa-6025-45f6-bc51-c813d60a38ae","Type":"ContainerStarted","Data":"7222999e565f69839117c895e5befb0712edff8fc724f8308008c691691322b5"} Apr 21 10:06:13.616607 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.616588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-khjfc" 
event={"ID":"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0","Type":"ContainerStarted","Data":"fbeb70cd63aea59813796bd7de0c612ab0f44f1adda031fcb4f784b59271af8c"} Apr 21 10:06:13.617454 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.617431 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" event={"ID":"24013a62-41fe-4530-aa6f-3ebb1c0b54cc","Type":"ContainerStarted","Data":"56053786c31b51232b999c14b340b93ac3d994ebeec9801ca346600b6a511ace"} Apr 21 10:06:13.828321 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.828239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:13.828321 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.828296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:13.828513 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.828333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:13.828513 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828384 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:13.828513 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828456 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls podName:81c76da7-40d4-4b08-985f-83b86ad934f2 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:14.828441115 +0000 UTC m=+139.223030369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8vhqm" (UID: "81c76da7-40d4-4b08-985f-83b86ad934f2") : secret "samples-operator-tls" not found Apr 21 10:06:13.828513 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828471 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:13.828513 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828475 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:13.828513 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828493 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b6946f86d-6v5gp: secret "image-registry-tls" not found Apr 21 10:06:13.828695 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828523 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls podName:d459929e-9c15-4174-a041-b14f3e183024 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:14.828508179 +0000 UTC m=+139.223097450 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzh5k" (UID: "d459929e-9c15-4174-a041-b14f3e183024") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:13.828695 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.828539 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls podName:928066b1-04d2-4c0f-9561-862618e07065 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:14.828531199 +0000 UTC m=+139.223120449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls") pod "image-registry-7b6946f86d-6v5gp" (UID: "928066b1-04d2-4c0f-9561-862618e07065") : secret "image-registry-tls" not found Apr 21 10:06:13.929766 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.929734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.929947 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.929777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:13.929947 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.929855 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:06:13.929947 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.929869 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:13.929947 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:13.929931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:13.929947 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.929944 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:14.929920349 +0000 UTC m=+139.324509605 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : secret "router-metrics-certs-default" not found Apr 21 10:06:13.930176 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.930021 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert podName:09b7d935-6954-4915-8391-de3719c71560 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:14.929999978 +0000 UTC m=+139.324589230 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2kfcz" (UID: "09b7d935-6954-4915-8391-de3719c71560") : secret "networking-console-plugin-cert" not found Apr 21 10:06:13.930176 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:13.930044 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:14.930034429 +0000 UTC m=+139.324623687 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:14.839644 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:14.839603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:14.840028 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:14.839663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:14.840028 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:14.839774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:14.840028 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.839908 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:14.840028 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.839968 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls podName:81c76da7-40d4-4b08-985f-83b86ad934f2 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.839950176 +0000 UTC m=+141.234539432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8vhqm" (UID: "81c76da7-40d4-4b08-985f-83b86ad934f2") : secret "samples-operator-tls" not found Apr 21 10:06:14.840623 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.840383 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:14.840623 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.840421 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b6946f86d-6v5gp: secret "image-registry-tls" not found Apr 21 10:06:14.840623 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.840481 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls podName:928066b1-04d2-4c0f-9561-862618e07065 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.840465328 +0000 UTC m=+141.235054592 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls") pod "image-registry-7b6946f86d-6v5gp" (UID: "928066b1-04d2-4c0f-9561-862618e07065") : secret "image-registry-tls" not found Apr 21 10:06:14.840623 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.840543 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:14.840623 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.840581 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls podName:d459929e-9c15-4174-a041-b14f3e183024 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.840569726 +0000 UTC m=+141.235158974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzh5k" (UID: "d459929e-9c15-4174-a041-b14f3e183024") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:14.940725 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:14.940698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:14.940858 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:14.940834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:14.940926 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.940857 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:16.940838154 +0000 UTC m=+141.335427409 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:14.940926 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:14.940901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:14.940926 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.940916 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:14.941082 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.940951 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.940940154 +0000 UTC m=+141.335529409 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : secret "router-metrics-certs-default" not found Apr 21 10:06:14.941082 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.940996 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:06:14.941082 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:14.941025 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert podName:09b7d935-6954-4915-8391-de3719c71560 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.941018094 +0000 UTC m=+141.335607346 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2kfcz" (UID: "09b7d935-6954-4915-8391-de3719c71560") : secret "networking-console-plugin-cert" not found Apr 21 10:06:15.623688 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:15.623653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" event={"ID":"a30b7cfa-6025-45f6-bc51-c813d60a38ae","Type":"ContainerStarted","Data":"a6f8667e72aa98e536c326f8bc80cd86bbd2a93c5a938b322bfd9eb58af8807e"} Apr 21 10:06:15.638765 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:15.638567 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9fjwn" podStartSLOduration=2.156320196 podStartE2EDuration="3.638550125s" podCreationTimestamp="2026-04-21 10:06:12 +0000 UTC" firstStartedPulling="2026-04-21 10:06:13.391866965 +0000 UTC m=+137.786456213" lastFinishedPulling="2026-04-21 10:06:14.874096869 +0000 UTC m=+139.268686142" observedRunningTime="2026-04-21 10:06:15.638357981 +0000 UTC m=+140.032947265" watchObservedRunningTime="2026-04-21 10:06:15.638550125 +0000 UTC m=+140.033139398" Apr 21 10:06:16.857625 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:16.857524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:16.857625 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:16.857573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:16.857664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857711 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857737 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b6946f86d-6v5gp: secret "image-registry-tls" not found Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857750 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857750 2577 secret.go:189] Couldn't 
get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857803 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls podName:81c76da7-40d4-4b08-985f-83b86ad934f2 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:20.857784798 +0000 UTC m=+145.252374047 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8vhqm" (UID: "81c76da7-40d4-4b08-985f-83b86ad934f2") : secret "samples-operator-tls" not found Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857828 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls podName:928066b1-04d2-4c0f-9561-862618e07065 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:20.857819159 +0000 UTC m=+145.252408410 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls") pod "image-registry-7b6946f86d-6v5gp" (UID: "928066b1-04d2-4c0f-9561-862618e07065") : secret "image-registry-tls" not found Apr 21 10:06:16.858044 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.857846 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls podName:d459929e-9c15-4174-a041-b14f3e183024 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:20.857837672 +0000 UTC m=+145.252426923 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzh5k" (UID: "d459929e-9c15-4174-a041-b14f3e183024") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:16.958122 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:16.958081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:16.958122 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:16.958128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:16.958365 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:16.958167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:16.958365 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.958270 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:16.958365 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.958316 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:20.958300837 +0000 UTC m=+145.352890106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:16.958365 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.958316 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:06:16.958365 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.958346 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:20.958327334 +0000 UTC m=+145.352916589 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : secret "router-metrics-certs-default" not found Apr 21 10:06:16.958656 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:16.958372 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert podName:09b7d935-6954-4915-8391-de3719c71560 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:20.958361088 +0000 UTC m=+145.352950337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2kfcz" (UID: "09b7d935-6954-4915-8391-de3719c71560") : secret "networking-console-plugin-cert" not found Apr 21 10:06:17.629329 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.629302 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/0.log" Apr 21 10:06:17.629515 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.629341 2577 generic.go:358] "Generic (PLEG): container finished" podID="24013a62-41fe-4530-aa6f-3ebb1c0b54cc" containerID="d208726753388776d24dbcd8e035d9abbfa53aa1c52c47775740b8c001c83a36" exitCode=255 Apr 21 10:06:17.629515 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.629426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" event={"ID":"24013a62-41fe-4530-aa6f-3ebb1c0b54cc","Type":"ContainerDied","Data":"d208726753388776d24dbcd8e035d9abbfa53aa1c52c47775740b8c001c83a36"} Apr 21 10:06:17.629727 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.629700 2577 scope.go:117] "RemoveContainer" containerID="d208726753388776d24dbcd8e035d9abbfa53aa1c52c47775740b8c001c83a36" Apr 21 10:06:17.630781 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.630755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" event={"ID":"b613a503-a4ac-455e-80bf-2ffd14fe2b3d","Type":"ContainerStarted","Data":"b001c368d82f2c6b56084ebb00d221f863632a74c7ab3e18b7964222f96df755"} Apr 21 10:06:17.632019 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.631996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-khjfc" event={"ID":"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0","Type":"ContainerStarted","Data":"690abdbe1689b5dffbb76e2807b5261503fa1c6ab3aafe95ba7784ee890b4eff"} Apr 21 10:06:17.668204 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:17.668158 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-khjfc" podStartSLOduration=1.633460073 podStartE2EDuration="4.668141638s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:13.525769475 +0000 UTC m=+137.920358724" lastFinishedPulling="2026-04-21 10:06:16.560451024 +0000 UTC m=+140.955040289" observedRunningTime="2026-04-21 10:06:17.666124236 +0000 UTC m=+142.060713508" watchObservedRunningTime="2026-04-21 10:06:17.668141638 +0000 UTC m=+142.062730912" Apr 21 10:06:17.681865 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:06:17.681819 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" podStartSLOduration=1.730875173 podStartE2EDuration="4.681802141s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:13.611510919 +0000 UTC m=+138.006100170" lastFinishedPulling="2026-04-21 10:06:16.562437874 +0000 UTC m=+140.957027138" observedRunningTime="2026-04-21 10:06:17.681423463 +0000 UTC m=+142.076012736" watchObservedRunningTime="2026-04-21 10:06:17.681802141 +0000 UTC m=+142.076391413" Apr 21 10:06:18.635337 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.635304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:06:18.635762 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.635670 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/0.log" Apr 21 10:06:18.635762 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.635704 2577 generic.go:358] "Generic (PLEG): container finished" podID="24013a62-41fe-4530-aa6f-3ebb1c0b54cc" containerID="fbf70f1d77499cc33b669d5695beabd60acfd4719d3e48bef24f35a6efc77ea4" exitCode=255 Apr 21 10:06:18.635879 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.635794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" event={"ID":"24013a62-41fe-4530-aa6f-3ebb1c0b54cc","Type":"ContainerDied","Data":"fbf70f1d77499cc33b669d5695beabd60acfd4719d3e48bef24f35a6efc77ea4"} Apr 21 10:06:18.635879 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.635835 2577 scope.go:117] "RemoveContainer" containerID="d208726753388776d24dbcd8e035d9abbfa53aa1c52c47775740b8c001c83a36" Apr 21 10:06:18.636065 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.636050 2577 scope.go:117] "RemoveContainer" containerID="fbf70f1d77499cc33b669d5695beabd60acfd4719d3e48bef24f35a6efc77ea4" Apr 21 10:06:18.636268 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:18.636250 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-77wnm_openshift-console-operator(24013a62-41fe-4530-aa6f-3ebb1c0b54cc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" podUID="24013a62-41fe-4530-aa6f-3ebb1c0b54cc" Apr 21 10:06:18.845540 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.845498 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vql7f"] Apr 21 10:06:18.848110 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.848088 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:18.850787 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.850768 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qmswt\"" Apr 21 10:06:18.850895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.850792 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 10:06:18.850895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.850804 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 10:06:18.850895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.850834 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 10:06:18.850895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.850842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 10:06:18.855726 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.855707 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vql7f"] Apr 21 10:06:18.977571 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.977537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhnlm\" (UniqueName: \"kubernetes.io/projected/50b24fce-afea-4872-a164-b81589afa632-kube-api-access-rhnlm\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:18.977730 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.977599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/50b24fce-afea-4872-a164-b81589afa632-signing-key\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:18.977730 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:18.977682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/50b24fce-afea-4872-a164-b81589afa632-signing-cabundle\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.078978 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.078948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/50b24fce-afea-4872-a164-b81589afa632-signing-cabundle\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.079119 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.079067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhnlm\" (UniqueName: \"kubernetes.io/projected/50b24fce-afea-4872-a164-b81589afa632-kube-api-access-rhnlm\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.079155 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:06:19.079116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/50b24fce-afea-4872-a164-b81589afa632-signing-key\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.079718 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.079697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/50b24fce-afea-4872-a164-b81589afa632-signing-cabundle\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.081738 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.081718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/50b24fce-afea-4872-a164-b81589afa632-signing-key\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.087738 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.087716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhnlm\" (UniqueName: \"kubernetes.io/projected/50b24fce-afea-4872-a164-b81589afa632-kube-api-access-rhnlm\") pod \"service-ca-865cb79987-vql7f\" (UID: \"50b24fce-afea-4872-a164-b81589afa632\") " pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.157823 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.157784 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vql7f" Apr 21 10:06:19.276913 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.276887 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vql7f"] Apr 21 10:06:19.279144 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:19.279117 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b24fce_afea_4872_a164_b81589afa632.slice/crio-8fdc392d3d4fc292a263c8f500c81c3dba4f6b2ec7e9223c7329665ece1938a8 WatchSource:0}: Error finding container 8fdc392d3d4fc292a263c8f500c81c3dba4f6b2ec7e9223c7329665ece1938a8: Status 404 returned error can't find the container with id 8fdc392d3d4fc292a263c8f500c81c3dba4f6b2ec7e9223c7329665ece1938a8 Apr 21 10:06:19.640260 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.640187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:06:19.640715 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:19.640652 2577 scope.go:117] "RemoveContainer" containerID="fbf70f1d77499cc33b669d5695beabd60acfd4719d3e48bef24f35a6efc77ea4" Apr 21 10:06:19.640892 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:19.640870 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-77wnm_openshift-console-operator(24013a62-41fe-4530-aa6f-3ebb1c0b54cc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" podUID="24013a62-41fe-4530-aa6f-3ebb1c0b54cc" Apr 21 10:06:19.641363 ip-10-0-132-46 kubenswrapper[2577]: 
I0421 10:06:19.641335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vql7f" event={"ID":"50b24fce-afea-4872-a164-b81589afa632","Type":"ContainerStarted","Data":"8fdc392d3d4fc292a263c8f500c81c3dba4f6b2ec7e9223c7329665ece1938a8"} Apr 21 10:06:20.895954 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:20.895799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:20.895954 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:20.895865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:20.895954 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:20.895905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:20.895954 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.895935 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 10:06:20.896422 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.895996 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:20.896422 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.896003 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls podName:81c76da7-40d4-4b08-985f-83b86ad934f2 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:28.895983545 +0000 UTC m=+153.290572812 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8vhqm" (UID: "81c76da7-40d4-4b08-985f-83b86ad934f2") : secret "samples-operator-tls" not found Apr 21 10:06:20.896422 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.896003 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:20.896422 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.896023 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b6946f86d-6v5gp: secret "image-registry-tls" not found Apr 21 10:06:20.896422 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.896060 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls podName:d459929e-9c15-4174-a041-b14f3e183024 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:28.896041336 +0000 UTC m=+153.290630593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzh5k" (UID: "d459929e-9c15-4174-a041-b14f3e183024") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:20.896422 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.896106 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls podName:928066b1-04d2-4c0f-9561-862618e07065 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:28.896092859 +0000 UTC m=+153.290682110 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls") pod "image-registry-7b6946f86d-6v5gp" (UID: "928066b1-04d2-4c0f-9561-862618e07065") : secret "image-registry-tls" not found Apr 21 10:06:20.997056 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:20.997020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:20.997231 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:20.997067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:20.997231 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:20.997106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:20.997231 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.997166 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 10:06:20.997385 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.997239 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:28.997218485 +0000 UTC m=+153.391807734 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : secret "router-metrics-certs-default" not found Apr 21 10:06:20.997385 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.997238 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:06:20.997385 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.997254 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle podName:1181d1e9-066e-44e5-bc62-d300a81ad7a8 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:28.997247667 +0000 UTC m=+153.391836916 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle") pod "router-default-747657c5d4-nfxrn" (UID: "1181d1e9-066e-44e5-bc62-d300a81ad7a8") : configmap references non-existent config key: service-ca.crt Apr 21 10:06:20.997385 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:20.997319 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert podName:09b7d935-6954-4915-8391-de3719c71560 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:28.997300494 +0000 UTC m=+153.391889757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2kfcz" (UID: "09b7d935-6954-4915-8391-de3719c71560") : secret "networking-console-plugin-cert" not found Apr 21 10:06:21.227280 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:21.227253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c9648_398c1473-0683-4af4-866e-a4c6405244ff/dns-node-resolver/0.log" Apr 21 10:06:21.651706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:21.651626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vql7f" event={"ID":"50b24fce-afea-4872-a164-b81589afa632","Type":"ContainerStarted","Data":"cc0e7bf8bd6d60e603325c9ae8e6ea2965c4058af856c3abc899c5d6e2b2c0ef"} Apr 21 10:06:21.670118 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:21.670051 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-vql7f" podStartSLOduration=2.08946541 podStartE2EDuration="3.670032894s" podCreationTimestamp="2026-04-21 10:06:18 +0000 UTC" firstStartedPulling="2026-04-21 10:06:19.280862464 +0000 UTC m=+143.675451722" lastFinishedPulling="2026-04-21 10:06:20.861429939 +0000 UTC m=+145.256019206" observedRunningTime="2026-04-21 10:06:21.668656771 +0000 UTC m=+146.063246043" watchObservedRunningTime="2026-04-21 10:06:21.670032894 +0000 UTC m=+146.064622166" Apr 21 10:06:22.028430 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:22.028401 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fznzv_1744533a-262f-4150-9f9b-9183b9e8576e/node-ca/0.log" Apr 21 10:06:23.402732 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:23.402685 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:23.402732 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:23.402742 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:23.403257 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:23.403186 2577 scope.go:117] "RemoveContainer" containerID="fbf70f1d77499cc33b669d5695beabd60acfd4719d3e48bef24f35a6efc77ea4" Apr 21 10:06:23.403456 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:23.403433 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-77wnm_openshift-console-operator(24013a62-41fe-4530-aa6f-3ebb1c0b54cc)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" podUID="24013a62-41fe-4530-aa6f-3ebb1c0b54cc" Apr 21 10:06:23.831950 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:23.831921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-c7d6t_b613a503-a4ac-455e-80bf-2ffd14fe2b3d/kube-storage-version-migrator-operator/0.log" Apr 21 10:06:28.970438 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:28.970371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:28.970802 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:28.970465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:28.970802 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:28.970494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:28.970802 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:28.970638 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:28.970802 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:28.970701 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls podName:d459929e-9c15-4174-a041-b14f3e183024 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:44.970687261 +0000 UTC m=+169.365276510 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzh5k" (UID: "d459929e-9c15-4174-a041-b14f3e183024") : secret "cluster-monitoring-operator-tls" not found Apr 21 10:06:28.973033 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:28.973008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c76da7-40d4-4b08-985f-83b86ad934f2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8vhqm\" (UID: \"81c76da7-40d4-4b08-985f-83b86ad934f2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:28.973180 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:28.973157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"image-registry-7b6946f86d-6v5gp\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:28.986783 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:28.986763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" Apr 21 10:06:29.008724 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.008694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:29.072459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.071704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:29.072459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.071841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:29.072459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.071873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:29.072459 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:29.072015 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 10:06:29.072459 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:29.072079 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert podName:09b7d935-6954-4915-8391-de3719c71560 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:45.072057207 +0000 UTC m=+169.466646458 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2kfcz" (UID: "09b7d935-6954-4915-8391-de3719c71560") : secret "networking-console-plugin-cert" not found Apr 21 10:06:29.072459 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.072374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1181d1e9-066e-44e5-bc62-d300a81ad7a8-service-ca-bundle\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:29.074767 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.074745 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1181d1e9-066e-44e5-bc62-d300a81ad7a8-metrics-certs\") pod \"router-default-747657c5d4-nfxrn\" (UID: \"1181d1e9-066e-44e5-bc62-d300a81ad7a8\") " pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:29.092781 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.092747 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:29.115683 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.115632 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm"] Apr 21 10:06:29.141155 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.140878 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b6946f86d-6v5gp"] Apr 21 10:06:29.144413 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:29.144363 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928066b1_04d2_4c0f_9561_862618e07065.slice/crio-3aa06f5a41c149a49eadc7e2a07e739661475b2173b97620bc231834a3bf8ed6 WatchSource:0}: Error finding container 3aa06f5a41c149a49eadc7e2a07e739661475b2173b97620bc231834a3bf8ed6: Status 404 returned error can't find the container with id 3aa06f5a41c149a49eadc7e2a07e739661475b2173b97620bc231834a3bf8ed6 Apr 21 10:06:29.220207 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.220179 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-747657c5d4-nfxrn"] Apr 21 10:06:29.224026 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:29.224004 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1181d1e9_066e_44e5_bc62_d300a81ad7a8.slice/crio-86a00b43e0e3c91ca41721de3eac8d2129906f59674e221fb926969e08366166 WatchSource:0}: Error finding container 86a00b43e0e3c91ca41721de3eac8d2129906f59674e221fb926969e08366166: Status 404 returned error can't find the container with id 86a00b43e0e3c91ca41721de3eac8d2129906f59674e221fb926969e08366166 Apr 21 10:06:29.671933 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.671898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-747657c5d4-nfxrn" event={"ID":"1181d1e9-066e-44e5-bc62-d300a81ad7a8","Type":"ContainerStarted","Data":"dc4a0d8e219af82de10c09f988e86ebe44173501b4e20f32d779bb2a98ead077"} Apr 
21 10:06:29.672089 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.671938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-747657c5d4-nfxrn" event={"ID":"1181d1e9-066e-44e5-bc62-d300a81ad7a8","Type":"ContainerStarted","Data":"86a00b43e0e3c91ca41721de3eac8d2129906f59674e221fb926969e08366166"} Apr 21 10:06:29.673196 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.673174 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" event={"ID":"928066b1-04d2-4c0f-9561-862618e07065","Type":"ContainerStarted","Data":"e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e"} Apr 21 10:06:29.673300 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.673200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" event={"ID":"928066b1-04d2-4c0f-9561-862618e07065","Type":"ContainerStarted","Data":"3aa06f5a41c149a49eadc7e2a07e739661475b2173b97620bc231834a3bf8ed6"} Apr 21 10:06:29.673300 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.673269 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:06:29.674071 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.674054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" event={"ID":"81c76da7-40d4-4b08-985f-83b86ad934f2","Type":"ContainerStarted","Data":"b71a08c145304cdf02db485b1f0fc83d97f7ae48a8ab8439d8b02429fe2b5e67"} Apr 21 10:06:29.690371 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.690331 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-747657c5d4-nfxrn" podStartSLOduration=16.690320422 podStartE2EDuration="16.690320422s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:29.689619796 +0000 UTC m=+154.084209060" watchObservedRunningTime="2026-04-21 10:06:29.690320422 +0000 UTC m=+154.084909692" Apr 21 10:06:29.707593 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:29.707557 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" podStartSLOduration=16.707545003 podStartE2EDuration="16.707545003s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:29.706341884 +0000 UTC m=+154.100931155" watchObservedRunningTime="2026-04-21 10:06:29.707545003 +0000 UTC m=+154.102134274" Apr 21 10:06:30.093888 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:30.093607 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:30.097210 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:30.097187 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:30.677760 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:30.677558 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:30.678743 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:06:30.678719 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-747657c5d4-nfxrn" Apr 21 10:06:31.681636 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:31.681595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" event={"ID":"81c76da7-40d4-4b08-985f-83b86ad934f2","Type":"ContainerStarted","Data":"2166fd834c3251e7ece970957889e826044a6675c9d56a8128ddda07517dcfb2"} Apr 21 10:06:31.681636 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:31.681638 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" event={"ID":"81c76da7-40d4-4b08-985f-83b86ad934f2","Type":"ContainerStarted","Data":"13afba999469a262538549d0b41f6fa67001a433e8a5229294d112609a0569dc"} Apr 21 10:06:31.696737 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:31.696642 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8vhqm" podStartSLOduration=16.987039859 podStartE2EDuration="18.696623205s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:29.161774591 +0000 UTC m=+153.556363841" lastFinishedPulling="2026-04-21 10:06:30.871357935 +0000 UTC m=+155.265947187" observedRunningTime="2026-04-21 10:06:31.695944764 +0000 UTC m=+156.090534032" watchObservedRunningTime="2026-04-21 10:06:31.696623205 +0000 UTC m=+156.091212478" Apr 21 10:06:32.986325 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:32.986217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-g62bp" podUID="45c00ff2-e16b-4854-9279-a0a6d25f59c8" Apr 21 10:06:32.998609 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:32.998583 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p5cg2" podUID="f34ed386-407f-400b-a309-9c15bf12db74" Apr 21 10:06:33.686996 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:33.686963 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g62bp" Apr 21 10:06:34.199684 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:34.199651 2577 scope.go:117] "RemoveContainer" containerID="fbf70f1d77499cc33b669d5695beabd60acfd4719d3e48bef24f35a6efc77ea4" Apr 21 10:06:34.213866 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:06:34.213831 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7czdf" podUID="2ae89f79-2df1-4414-b256-f90091f5fa3c" Apr 21 10:06:34.690563 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:34.690538 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:06:34.690708 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:34.690628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" event={"ID":"24013a62-41fe-4530-aa6f-3ebb1c0b54cc","Type":"ContainerStarted","Data":"d8301948700aea5f14036f5c2fda6bcf1d621ed621d10a0d88cf5da9a875c7fe"} Apr 21 10:06:34.690938 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:34.690902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:34.711270 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:34.711224 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" podStartSLOduration=18.697872433 podStartE2EDuration="21.711208697s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:13.54343118 +0000 UTC m=+137.938020429" lastFinishedPulling="2026-04-21 10:06:16.556767442 +0000 UTC m=+140.951356693" observedRunningTime="2026-04-21 10:06:34.710419657 +0000 UTC m=+159.105008932" watchObservedRunningTime="2026-04-21 10:06:34.711208697 +0000 UTC m=+159.105797967" Apr 21 10:06:34.798950 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:34.798920 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-77wnm" Apr 21 10:06:37.949658 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:37.949576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:06:37.949658 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:37.949633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:06:37.952075 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:37.952056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c00ff2-e16b-4854-9279-a0a6d25f59c8-metrics-tls\") pod \"dns-default-g62bp\" (UID: \"45c00ff2-e16b-4854-9279-a0a6d25f59c8\") " pod="openshift-dns/dns-default-g62bp" Apr 21 10:06:37.952166 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:06:37.952148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f34ed386-407f-400b-a309-9c15bf12db74-cert\") pod \"ingress-canary-p5cg2\" (UID: \"f34ed386-407f-400b-a309-9c15bf12db74\") " pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:06:38.189981 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.189946 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-g7j6z\"" Apr 21 10:06:38.198264 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.198243 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g62bp" Apr 21 10:06:38.281409 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.281063 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rpjhk"] Apr 21 10:06:38.293629 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.292681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.309007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.302217 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wh5gv\"" Apr 21 10:06:38.309007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.302533 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 10:06:38.309007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.302783 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 10:06:38.309007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.304848 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpjhk"] Apr 21 10:06:38.345665 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.345639 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b6946f86d-6v5gp"] Apr 21 10:06:38.353010 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.352918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9588b56f-2f02-407d-9d34-92fd50cd0ced-data-volume\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.353010 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.352959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vst4d\" (UniqueName: \"kubernetes.io/projected/9588b56f-2f02-407d-9d34-92fd50cd0ced-kube-api-access-vst4d\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.353237 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.353102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9588b56f-2f02-407d-9d34-92fd50cd0ced-crio-socket\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.353237 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:06:38.353156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9588b56f-2f02-407d-9d34-92fd50cd0ced-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.353237 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.353203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9588b56f-2f02-407d-9d34-92fd50cd0ced-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.354060 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.354039 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g62bp"] Apr 21 10:06:38.357699 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:38.357668 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c00ff2_e16b_4854_9279_a0a6d25f59c8.slice/crio-09e811c08a4b82bc5db548d94a86eb4168a480ce95927384a5aa9ae37a21d4c8 WatchSource:0}: Error finding container 09e811c08a4b82bc5db548d94a86eb4168a480ce95927384a5aa9ae37a21d4c8: Status 404 returned error can't find the container with id 09e811c08a4b82bc5db548d94a86eb4168a480ce95927384a5aa9ae37a21d4c8 Apr 21 10:06:38.358840 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.358630 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-l49kt"] Apr 21 10:06:38.363084 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.363069 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:38.365285 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.365265 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2lm6n\"" Apr 21 10:06:38.365494 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.365473 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 10:06:38.366069 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.366048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 10:06:38.374045 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.373994 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l49kt"] Apr 21 10:06:38.453954 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.453926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9588b56f-2f02-407d-9d34-92fd50cd0ced-data-volume\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.453954 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.453959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vst4d\" (UniqueName: \"kubernetes.io/projected/9588b56f-2f02-407d-9d34-92fd50cd0ced-kube-api-access-vst4d\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.454198 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.454018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9588b56f-2f02-407d-9d34-92fd50cd0ced-crio-socket\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.454198 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.454039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9588b56f-2f02-407d-9d34-92fd50cd0ced-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.454198 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.454059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9588b56f-2f02-407d-9d34-92fd50cd0ced-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.454198 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.454082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hhr\" (UniqueName: \"kubernetes.io/projected/2bf51414-8294-4d06-a0ae-03141b8cbf38-kube-api-access-r5hhr\") pod \"downloads-6bcc868b7-l49kt\" (UID: \"2bf51414-8294-4d06-a0ae-03141b8cbf38\") " pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:38.454198 ip-10-0-132-46 kubenswrapper[2577]: 
I0421 10:06:38.454137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9588b56f-2f02-407d-9d34-92fd50cd0ced-crio-socket\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.454360 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.454298 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9588b56f-2f02-407d-9d34-92fd50cd0ced-data-volume\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.454672 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.454641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9588b56f-2f02-407d-9d34-92fd50cd0ced-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.456461 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.456420 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9588b56f-2f02-407d-9d34-92fd50cd0ced-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.471569 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.471549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vst4d\" (UniqueName: \"kubernetes.io/projected/9588b56f-2f02-407d-9d34-92fd50cd0ced-kube-api-access-vst4d\") pod \"insights-runtime-extractor-rpjhk\" (UID: \"9588b56f-2f02-407d-9d34-92fd50cd0ced\") " pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.555101 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.555068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5hhr\" (UniqueName: \"kubernetes.io/projected/2bf51414-8294-4d06-a0ae-03141b8cbf38-kube-api-access-r5hhr\") pod \"downloads-6bcc868b7-l49kt\" (UID: \"2bf51414-8294-4d06-a0ae-03141b8cbf38\") " pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:38.563196 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.563170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hhr\" (UniqueName: \"kubernetes.io/projected/2bf51414-8294-4d06-a0ae-03141b8cbf38-kube-api-access-r5hhr\") pod \"downloads-6bcc868b7-l49kt\" (UID: \"2bf51414-8294-4d06-a0ae-03141b8cbf38\") " pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:38.621190 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.621165 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpjhk" Apr 21 10:06:38.675094 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.674929 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:38.707234 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.707151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g62bp" event={"ID":"45c00ff2-e16b-4854-9279-a0a6d25f59c8","Type":"ContainerStarted","Data":"09e811c08a4b82bc5db548d94a86eb4168a480ce95927384a5aa9ae37a21d4c8"} Apr 21 10:06:38.751151 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.751117 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpjhk"] Apr 21 10:06:38.754453 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:38.754416 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9588b56f_2f02_407d_9d34_92fd50cd0ced.slice/crio-1fc31878550c2855a4632d06bb41a5a46fb78a13356698a0f78833d4a5b44c28 WatchSource:0}: Error finding container 1fc31878550c2855a4632d06bb41a5a46fb78a13356698a0f78833d4a5b44c28: Status 404 returned error can't find the container with id 1fc31878550c2855a4632d06bb41a5a46fb78a13356698a0f78833d4a5b44c28 Apr 21 10:06:38.812038 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:38.812012 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l49kt"] Apr 21 10:06:38.814630 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:38.814603 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf51414_8294_4d06_a0ae_03141b8cbf38.slice/crio-df2b08219912f68374618d6b6edd7416b8113b518477475fcc8d5823f7169b76 WatchSource:0}: Error finding container df2b08219912f68374618d6b6edd7416b8113b518477475fcc8d5823f7169b76: Status 404 returned error can't find the container with id df2b08219912f68374618d6b6edd7416b8113b518477475fcc8d5823f7169b76 Apr 21 10:06:39.711796 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:39.711757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpjhk" event={"ID":"9588b56f-2f02-407d-9d34-92fd50cd0ced","Type":"ContainerStarted","Data":"47aba8972abc0ef043a66c3153afb36c86719a8f3ab289d7bd85ada5f9c50a50"} Apr 21 10:06:39.711796 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:39.711799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpjhk" event={"ID":"9588b56f-2f02-407d-9d34-92fd50cd0ced","Type":"ContainerStarted","Data":"1fc31878550c2855a4632d06bb41a5a46fb78a13356698a0f78833d4a5b44c28"} Apr 21 10:06:39.712963 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:39.712940 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l49kt" event={"ID":"2bf51414-8294-4d06-a0ae-03141b8cbf38","Type":"ContainerStarted","Data":"df2b08219912f68374618d6b6edd7416b8113b518477475fcc8d5823f7169b76"} Apr 21 10:06:40.718336 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:40.718290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g62bp" event={"ID":"45c00ff2-e16b-4854-9279-a0a6d25f59c8","Type":"ContainerStarted","Data":"68e7216cc8e5755b43726a6d6783c955cf0c877136927c9922002ed525f8f587"} Apr 21 10:06:40.718826 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:40.718343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g62bp" 
event={"ID":"45c00ff2-e16b-4854-9279-a0a6d25f59c8","Type":"ContainerStarted","Data":"a7dccb797c885fc0daf64b57370f9abf19d5ff6b57ff5311f3f43a1dfbbab831"} Apr 21 10:06:40.718826 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:40.718454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g62bp" Apr 21 10:06:40.720146 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:40.720114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpjhk" event={"ID":"9588b56f-2f02-407d-9d34-92fd50cd0ced","Type":"ContainerStarted","Data":"36c76b764c8bc31ad1a3cd3b4ba02f67a93d353b1c9b0721bfd05a1d60edd608"} Apr 21 10:06:40.736122 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:40.736070 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g62bp" podStartSLOduration=130.374640918 podStartE2EDuration="2m11.736052087s" podCreationTimestamp="2026-04-21 10:04:29 +0000 UTC" firstStartedPulling="2026-04-21 10:06:38.362296467 +0000 UTC m=+162.756885715" lastFinishedPulling="2026-04-21 10:06:39.723707632 +0000 UTC m=+164.118296884" observedRunningTime="2026-04-21 10:06:40.735182917 +0000 UTC m=+165.129772212" watchObservedRunningTime="2026-04-21 10:06:40.736052087 +0000 UTC m=+165.130641360" Apr 21 10:06:41.725414 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:41.725358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpjhk" event={"ID":"9588b56f-2f02-407d-9d34-92fd50cd0ced","Type":"ContainerStarted","Data":"d85241dbadd6bd1622f97f1f531e37e744a43fb22d09ae8e48c76317512069f3"} Apr 21 10:06:41.744721 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:41.744661 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rpjhk" podStartSLOduration=1.397470845 podStartE2EDuration="3.744645984s" podCreationTimestamp="2026-04-21 10:06:38 +0000 UTC" firstStartedPulling="2026-04-21 10:06:38.822584218 +0000 UTC m=+163.217173467" lastFinishedPulling="2026-04-21 10:06:41.169759356 +0000 UTC m=+165.564348606" observedRunningTime="2026-04-21 10:06:41.744426225 +0000 UTC m=+166.139015494" watchObservedRunningTime="2026-04-21 10:06:41.744645984 +0000 UTC m=+166.139235256" Apr 21 10:06:45.014240 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.014195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:45.016949 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.016924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d459929e-9c15-4174-a041-b14f3e183024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzh5k\" (UID: \"d459929e-9c15-4174-a041-b14f3e183024\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:45.115629 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.115594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:45.118508 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.118478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09b7d935-6954-4915-8391-de3719c71560-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2kfcz\" (UID: \"09b7d935-6954-4915-8391-de3719c71560\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:45.178310 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.178263 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" Apr 21 10:06:45.279486 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.279449 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" Apr 21 10:06:45.317247 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.317196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k"] Apr 21 10:06:45.320881 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:45.320857 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd459929e_9c15_4174_a041_b14f3e183024.slice/crio-4dd291f15326df2bf7f5ff676d2085e5ad5a733f732ed96283c26c60731490ba WatchSource:0}: Error finding container 4dd291f15326df2bf7f5ff676d2085e5ad5a733f732ed96283c26c60731490ba: Status 404 returned error can't find the container with id 4dd291f15326df2bf7f5ff676d2085e5ad5a733f732ed96283c26c60731490ba Apr 21 10:06:45.419699 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.419665 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz"] Apr 21 10:06:45.422719 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:45.422682 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b7d935_6954_4915_8391_de3719c71560.slice/crio-4da616aa0668054f19da0ad9f484707163ffbab42ed5d07f63df0f3a3aa63717 WatchSource:0}: Error finding container 4da616aa0668054f19da0ad9f484707163ffbab42ed5d07f63df0f3a3aa63717: Status 404 returned error can't find the container with id 4da616aa0668054f19da0ad9f484707163ffbab42ed5d07f63df0f3a3aa63717 Apr 21 10:06:45.739217 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.739171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" event={"ID":"09b7d935-6954-4915-8391-de3719c71560","Type":"ContainerStarted","Data":"4da616aa0668054f19da0ad9f484707163ffbab42ed5d07f63df0f3a3aa63717"} Apr 21 10:06:45.740243 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:45.740215 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" event={"ID":"d459929e-9c15-4174-a041-b14f3e183024","Type":"ContainerStarted","Data":"4dd291f15326df2bf7f5ff676d2085e5ad5a733f732ed96283c26c60731490ba"} Apr 21 10:06:47.199732 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:47.199694 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:06:47.202693 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:47.202668 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9st8k\"" Apr 21 10:06:47.210899 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:47.210874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5cg2" Apr 21 10:06:47.897041 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:47.897012 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p5cg2"] Apr 21 10:06:47.899996 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:47.899968 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34ed386_407f_400b_a309_9c15bf12db74.slice/crio-7d0b88f1850d1a4582e0a2a62124d47404f8e9d3ec1a20e8778f07d772790311 WatchSource:0}: Error finding container 7d0b88f1850d1a4582e0a2a62124d47404f8e9d3ec1a20e8778f07d772790311: Status 404 returned error can't find the container with id 7d0b88f1850d1a4582e0a2a62124d47404f8e9d3ec1a20e8778f07d772790311 Apr 21 10:06:48.200224 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.200188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:06:48.353320 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.353165 2577 patch_prober.go:28] interesting pod/image-registry-7b6946f86d-6v5gp container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 10:06:48.353320 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.353235 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" podUID="928066b1-04d2-4c0f-9561-862618e07065" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:06:48.371864 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.371833 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k"] Apr 21 10:06:48.377149 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.377127 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:48.379745 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.379722 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 10:06:48.379865 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.379769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-t7jdh\"" Apr 21 10:06:48.388139 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.388070 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k"] Apr 21 10:06:48.445116 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.445078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/02d899a4-d56f-4804-bca2-78b9cd085a39-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9p68k\" (UID: \"02d899a4-d56f-4804-bca2-78b9cd085a39\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:48.545627 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.545537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/02d899a4-d56f-4804-bca2-78b9cd085a39-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9p68k\" (UID: \"02d899a4-d56f-4804-bca2-78b9cd085a39\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:48.548508 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.548483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/02d899a4-d56f-4804-bca2-78b9cd085a39-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9p68k\" (UID: \"02d899a4-d56f-4804-bca2-78b9cd085a39\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:48.689483 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.689443 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:48.750498 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.750456 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" event={"ID":"d459929e-9c15-4174-a041-b14f3e183024","Type":"ContainerStarted","Data":"a1bcd7c3307fe3b1689d00e3abbf1cb941532a6805f21d1b5e63e8e2930b7071"} Apr 21 10:06:48.751749 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.751715 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p5cg2" event={"ID":"f34ed386-407f-400b-a309-9c15bf12db74","Type":"ContainerStarted","Data":"7d0b88f1850d1a4582e0a2a62124d47404f8e9d3ec1a20e8778f07d772790311"} Apr 21 10:06:48.753114 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.753091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" event={"ID":"09b7d935-6954-4915-8391-de3719c71560","Type":"ContainerStarted","Data":"71894a90a6688ad66034bafb2538ed684c5f439c4ad307e97b1c532e9ed7ec15"} Apr 21 10:06:48.766659 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.766462 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzh5k" podStartSLOduration=33.321482538 podStartE2EDuration="35.766445611s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:45.323354161 +0000 UTC m=+169.717943426" lastFinishedPulling="2026-04-21 10:06:47.768317239 +0000 UTC m=+172.162906499" observedRunningTime="2026-04-21 10:06:48.766264494 +0000 UTC m=+173.160853777" watchObservedRunningTime="2026-04-21 10:06:48.766445611 +0000 UTC m=+173.161034881" Apr 21 10:06:48.786767 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:48.786700 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2kfcz" podStartSLOduration=33.449098327 podStartE2EDuration="35.786683123s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="2026-04-21 10:06:45.424695799 +0000 UTC m=+169.819285049" lastFinishedPulling="2026-04-21 10:06:47.762280591 +0000 UTC m=+172.156869845" observedRunningTime="2026-04-21 10:06:48.784810489 +0000 UTC m=+173.179399759" watchObservedRunningTime="2026-04-21 10:06:48.786683123 +0000 UTC m=+173.181272395" Apr 21 10:06:50.728767 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:50.728723 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g62bp" Apr 21 10:06:54.985168 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:54.985115 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k"] Apr 21 10:06:54.988902 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:06:54.988873 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d899a4_d56f_4804_bca2_78b9cd085a39.slice/crio-90a08c2aa245995878a6598b26510e4db050a14a1994fa1950b24561b0967c85 WatchSource:0}: Error finding container 90a08c2aa245995878a6598b26510e4db050a14a1994fa1950b24561b0967c85: Status 404 returned error can't find the container with id 90a08c2aa245995878a6598b26510e4db050a14a1994fa1950b24561b0967c85 Apr 21 10:06:55.780692 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:06:55.780633 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" event={"ID":"02d899a4-d56f-4804-bca2-78b9cd085a39","Type":"ContainerStarted","Data":"90a08c2aa245995878a6598b26510e4db050a14a1994fa1950b24561b0967c85"} Apr 21 10:06:55.782418 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:55.782370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l49kt" event={"ID":"2bf51414-8294-4d06-a0ae-03141b8cbf38","Type":"ContainerStarted","Data":"0ea81d1013cf094651ff0ee100702f786208ac291b8086ffca1847c59d39184a"} Apr 21 10:06:55.782811 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:55.782587 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:55.797345 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:55.797313 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-l49kt" Apr 21 10:06:55.801619 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:55.801455 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-l49kt" podStartSLOduration=1.644484078 podStartE2EDuration="17.801439028s" podCreationTimestamp="2026-04-21 10:06:38 +0000 UTC" firstStartedPulling="2026-04-21 10:06:38.817111698 +0000 UTC m=+163.211700952" lastFinishedPulling="2026-04-21 10:06:54.974066645 +0000 UTC m=+179.368655902" observedRunningTime="2026-04-21 10:06:55.799070207 +0000 UTC m=+180.193659499" watchObservedRunningTime="2026-04-21 10:06:55.801439028 +0000 UTC m=+180.196028301" Apr 21 10:06:56.787192 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:56.787154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p5cg2" event={"ID":"f34ed386-407f-400b-a309-9c15bf12db74","Type":"ContainerStarted","Data":"bfdc11a3d19088ea47ba9159f6f1e0931b7fc18a8b53f1a68a4d34972b7a64c8"} Apr 21 10:06:56.789053 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:56.789023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" event={"ID":"02d899a4-d56f-4804-bca2-78b9cd085a39","Type":"ContainerStarted","Data":"04531a69168e1f2d400915102d6749a85d26037b951f5f567ef63d64bbffc5a8"} Apr 21 10:06:56.789178 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:56.789066 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:56.791269 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:56.791195 2577 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-57cf98b594-9p68k container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.134.0.19:8443/healthz\": dial tcp 10.134.0.19:8443: connect: connection refused" start-of-body= Apr 21 10:06:56.791269 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:56.791236 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" podUID="02d899a4-d56f-4804-bca2-78b9cd085a39" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.134.0.19:8443/healthz\": dial tcp 10.134.0.19:8443: connect: connection refused" Apr 21 10:06:56.812807 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:06:56.812756 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" podStartSLOduration=7.112981153 podStartE2EDuration="8.812740577s" podCreationTimestamp="2026-04-21 10:06:48 +0000 UTC" firstStartedPulling="2026-04-21 10:06:54.990363911 +0000 UTC m=+179.384953160" lastFinishedPulling="2026-04-21 10:06:56.690123322 +0000 UTC m=+181.084712584" observedRunningTime="2026-04-21 10:06:56.811001378 +0000 UTC m=+181.205590639" watchObservedRunningTime="2026-04-21 10:06:56.812740577 +0000 UTC m=+181.207329847" Apr 21 10:06:57.797001 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:57.796970 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9p68k" Apr 21 10:06:57.813737 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:57.813686 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p5cg2" podStartSLOduration=140.018001325 podStartE2EDuration="2m28.813668156s" podCreationTimestamp="2026-04-21 10:04:29 +0000 UTC" firstStartedPulling="2026-04-21 10:06:47.902457647 +0000 UTC m=+172.297046896" lastFinishedPulling="2026-04-21 10:06:56.698124478 +0000 UTC m=+181.092713727" observedRunningTime="2026-04-21 10:06:57.811433064 +0000 UTC m=+182.206022336" watchObservedRunningTime="2026-04-21 10:06:57.813668156 +0000 UTC m=+182.208257427" Apr 21 10:06:58.351048 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:06:58.351014 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:07:02.977284 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:02.977244 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7vnl9"] Apr 21 10:07:03.027976 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.027945 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.034968 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.034942 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 10:07:03.037099 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.035381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wtmfz\"" Apr 21 10:07:03.037517 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.035524 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 10:07:03.037960 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.035589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 10:07:03.038185 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.035698 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 10:07:03.176846 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.176804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-sys\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.176858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-metrics-client-ca\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.176951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-textfile\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.176989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-root\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.177017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxllr\" (UniqueName: \"kubernetes.io/projected/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-kube-api-access-qxllr\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.177049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177320 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.177130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-tls\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177320 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.177207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-wtmp\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.177320 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.177228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-textfile\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-root\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxllr\" (UniqueName: \"kubernetes.io/projected/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-kube-api-access-qxllr\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-tls\") pod \"node-exporter-7vnl9\" (UID: 
\"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-wtmp\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-sys\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.278622 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.278471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-metrics-client-ca\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.279142 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.279116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-metrics-client-ca\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.279309 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.279248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-sys\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.279309 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:07:03.279265 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 10:07:03.279309 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.279304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-root\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.279500 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:07:03.279329 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-tls podName:66ce4f7a-aba6-4b97-8863-86ab1ef171c0 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:03.779310988 +0000 UTC m=+188.173900252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-tls") pod "node-exporter-7vnl9" (UID: "66ce4f7a-aba6-4b97-8863-86ab1ef171c0") : secret "node-exporter-tls" not found Apr 21 10:07:03.279679 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.279652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.279834 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.279773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-textfile\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.279919 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.279791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-wtmp\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.282207 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.282181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.305901 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.305874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxllr\" (UniqueName: \"kubernetes.io/projected/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-kube-api-access-qxllr\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.368925 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.368686 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" podUID="928066b1-04d2-4c0f-9561-862618e07065" containerName="registry" containerID="cri-o://e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e" gracePeriod=30 Apr 21 10:07:03.648535 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.648508 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:07:03.783689 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783655 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-trusted-ca\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.783874 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783707 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-image-registry-private-configuration\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.783874 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783758 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tnkh\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-kube-api-access-2tnkh\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.783874 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783782 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.783874 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783832 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-installation-pull-secrets\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.783874 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783862 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-bound-sa-token\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.784038 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783895 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/928066b1-04d2-4c0f-9561-862618e07065-ca-trust-extracted\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.784038 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.783921 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-registry-certificates\") pod \"928066b1-04d2-4c0f-9561-862618e07065\" (UID: \"928066b1-04d2-4c0f-9561-862618e07065\") " Apr 21 10:07:03.784112 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.784085 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:03.784189 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.784168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-tls\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.784250 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.784221 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-trusted-ca\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.784437 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.784411 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:03.786430 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.786362 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:03.786895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.786799 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:03.786895 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.786850 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:03.787588 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.787162 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-kube-api-access-2tnkh" (OuterVolumeSpecName: "kube-api-access-2tnkh") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "kube-api-access-2tnkh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:03.787588 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.787542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66ce4f7a-aba6-4b97-8863-86ab1ef171c0-node-exporter-tls\") pod \"node-exporter-7vnl9\" (UID: \"66ce4f7a-aba6-4b97-8863-86ab1ef171c0\") " pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.788200 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.788172 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:03.795702 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.795680 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928066b1-04d2-4c0f-9561-862618e07065-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "928066b1-04d2-4c0f-9561-862618e07065" (UID: "928066b1-04d2-4c0f-9561-862618e07065"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:07:03.811574 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.811511 2577 generic.go:358] "Generic (PLEG): container finished" podID="928066b1-04d2-4c0f-9561-862618e07065" containerID="e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e" exitCode=0 Apr 21 10:07:03.811688 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.811584 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" Apr 21 10:07:03.811688 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.811599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" event={"ID":"928066b1-04d2-4c0f-9561-862618e07065","Type":"ContainerDied","Data":"e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e"} Apr 21 10:07:03.811688 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.811642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b6946f86d-6v5gp" event={"ID":"928066b1-04d2-4c0f-9561-862618e07065","Type":"ContainerDied","Data":"3aa06f5a41c149a49eadc7e2a07e739661475b2173b97620bc231834a3bf8ed6"} Apr 21 10:07:03.811688 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.811664 2577 scope.go:117] "RemoveContainer" containerID="e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e" Apr 21 10:07:03.821213 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.821192 2577 scope.go:117] "RemoveContainer" containerID="e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e" Apr 21 10:07:03.821620 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:07:03.821594 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e\": container with ID starting with e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e not found: ID does not exist" containerID="e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e" Apr 21 10:07:03.821727 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.821630 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e"} err="failed to get container status \"e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e\": rpc error: code = NotFound desc = could not find container \"e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e\": container with ID starting with e5bcf5205551ac7ac0feb2377a078f38e8a773ca7ae151e90d9e71d2b09b470e not found: ID does not exist" Apr 21 10:07:03.838201 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.838168 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b6946f86d-6v5gp"] Apr 21 10:07:03.841311 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.841286 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7b6946f86d-6v5gp"] Apr 21 10:07:03.885657 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885623 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-installation-pull-secrets\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.885657 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885661 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-bound-sa-token\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.885882 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885679 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/928066b1-04d2-4c0f-9561-862618e07065-ca-trust-extracted\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.885882 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885694 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/928066b1-04d2-4c0f-9561-862618e07065-registry-certificates\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.885882 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885712 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/928066b1-04d2-4c0f-9561-862618e07065-image-registry-private-configuration\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.885882 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885725 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tnkh\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-kube-api-access-2tnkh\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.885882 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.885741 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/928066b1-04d2-4c0f-9561-862618e07065-registry-tls\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:07:03.943698 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:03.943661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7vnl9" Apr 21 10:07:03.954084 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:07:03.954044 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ce4f7a_aba6_4b97_8863_86ab1ef171c0.slice/crio-4cbb09face7070bd6a6375a985f4260077229f543d26e63b88f2019ed2f6adc5 WatchSource:0}: Error finding container 4cbb09face7070bd6a6375a985f4260077229f543d26e63b88f2019ed2f6adc5: Status 404 returned error can't find the container with id 4cbb09face7070bd6a6375a985f4260077229f543d26e63b88f2019ed2f6adc5 Apr 21 10:07:04.204705 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.204669 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928066b1-04d2-4c0f-9561-862618e07065" path="/var/lib/kubelet/pods/928066b1-04d2-4c0f-9561-862618e07065/volumes" Apr 21 10:07:04.824214 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.824168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vnl9" event={"ID":"66ce4f7a-aba6-4b97-8863-86ab1ef171c0","Type":"ContainerStarted","Data":"4cbb09face7070bd6a6375a985f4260077229f543d26e63b88f2019ed2f6adc5"} Apr 21 10:07:04.980104 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.980068 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c4586d5c6-fqbln"] Apr 21 10:07:04.980433 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.980420 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="928066b1-04d2-4c0f-9561-862618e07065" containerName="registry" Apr 21 10:07:04.980484 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.980436 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="928066b1-04d2-4c0f-9561-862618e07065" containerName="registry" Apr 21 10:07:04.980522 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.980493 2577 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="928066b1-04d2-4c0f-9561-862618e07065" containerName="registry" Apr 21 10:07:04.996613 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.996581 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c4586d5c6-fqbln"] Apr 21 10:07:04.996775 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.996620 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:04.999266 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.999236 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 10:07:04.999266 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.999238 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 10:07:04.999486 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.999301 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 10:07:04.999549 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.999520 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d8pbmgsiqcdjd\"" Apr 21 10:07:04.999549 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:04.999520 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-z68ln\"" Apr 21 10:07:05.000697 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.000445 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 10:07:05.000697 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.000447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 10:07:05.096880 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.096784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.096880 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.096841 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-grpc-tls\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.097092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.096911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rwx\" (UniqueName: \"kubernetes.io/projected/7a60bfc1-ed25-4ed1-8e6e-906df625036f-kube-api-access-f2rwx\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.097092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.096938 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-tls\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.097092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.096967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.097092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.097002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.097092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.097039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a60bfc1-ed25-4ed1-8e6e-906df625036f-metrics-client-ca\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.097092 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.097066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.197928 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.197886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rwx\" (UniqueName: \"kubernetes.io/projected/7a60bfc1-ed25-4ed1-8e6e-906df625036f-kube-api-access-f2rwx\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198089 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.197940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-tls\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198089 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.197978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c4586d5c6-fqbln\" 
(UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198089 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.198021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198089 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.198050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a60bfc1-ed25-4ed1-8e6e-906df625036f-metrics-client-ca\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198089 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.198080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198359 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.198128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.198359 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.198194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-grpc-tls\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.199168 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.199137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a60bfc1-ed25-4ed1-8e6e-906df625036f-metrics-client-ca\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.201632 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.201600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-grpc-tls\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.201738 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.201679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-tls\") pod 
\"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.201738 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.201706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.201974 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.201952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.202074 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.201958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.202124 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.202065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7a60bfc1-ed25-4ed1-8e6e-906df625036f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.206552 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.206525 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rwx\" (UniqueName: \"kubernetes.io/projected/7a60bfc1-ed25-4ed1-8e6e-906df625036f-kube-api-access-f2rwx\") pod \"thanos-querier-5c4586d5c6-fqbln\" (UID: \"7a60bfc1-ed25-4ed1-8e6e-906df625036f\") " pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.309277 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.309256 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:05.439082 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.439005 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c4586d5c6-fqbln"] Apr 21 10:07:05.441602 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:07:05.441576 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a60bfc1_ed25_4ed1_8e6e_906df625036f.slice/crio-6fc9df31923787077bd5b34e85b8c974ff0817112e659ff990283e72aa61eb5b WatchSource:0}: Error finding container 6fc9df31923787077bd5b34e85b8c974ff0817112e659ff990283e72aa61eb5b: Status 404 returned error can't find the container with id 6fc9df31923787077bd5b34e85b8c974ff0817112e659ff990283e72aa61eb5b Apr 21 10:07:05.829402 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.829349 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"6fc9df31923787077bd5b34e85b8c974ff0817112e659ff990283e72aa61eb5b"} Apr 21 10:07:05.830743 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.830719 2577 generic.go:358] "Generic (PLEG): container finished" podID="66ce4f7a-aba6-4b97-8863-86ab1ef171c0" containerID="7cf1faa9f02c0b8b4df4a706208c364f90709cb3295916a959cc0c0c224345db" exitCode=0 Apr 21 10:07:05.830877 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:05.830760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vnl9" event={"ID":"66ce4f7a-aba6-4b97-8863-86ab1ef171c0","Type":"ContainerDied","Data":"7cf1faa9f02c0b8b4df4a706208c364f90709cb3295916a959cc0c0c224345db"} Apr 21 10:07:06.836475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:06.836436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vnl9" event={"ID":"66ce4f7a-aba6-4b97-8863-86ab1ef171c0","Type":"ContainerStarted","Data":"d9a163e699c44b5690f6f6b99eed18ad9fcf0a04b6aeb52159552fe9242c3f6f"} Apr 21 10:07:06.836475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:06.836476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vnl9" event={"ID":"66ce4f7a-aba6-4b97-8863-86ab1ef171c0","Type":"ContainerStarted","Data":"70b6a2ec6b492ed08339e26254760eb8ca97b6dea54e8a9b841eb2bad9901682"} Apr 21 10:07:06.858601 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:06.858552 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7vnl9" podStartSLOduration=3.573245059 podStartE2EDuration="4.858536443s" podCreationTimestamp="2026-04-21 10:07:02 +0000 UTC" firstStartedPulling="2026-04-21 10:07:03.956161157 +0000 UTC m=+188.350750411" lastFinishedPulling="2026-04-21 10:07:05.241452543 +0000 UTC m=+189.636041795" observedRunningTime="2026-04-21 10:07:06.856623729 +0000 UTC m=+191.251213003" watchObservedRunningTime="2026-04-21 10:07:06.858536443 +0000 UTC m=+191.253125713" Apr 21 10:07:07.636520 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.636492 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt"] Apr 21 10:07:07.656218 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.656194 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt"] Apr 21 10:07:07.656346 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:07:07.656334 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:07.659200 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.659164 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 10:07:07.659323 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.659206 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kl26l\"" Apr 21 10:07:07.820965 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.820933 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fbdea7ca-953d-42cf-a40b-8dcca399d130-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-csfpt\" (UID: \"fbdea7ca-953d-42cf-a40b-8dcca399d130\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:07.840798 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.840761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"5936169a8772065f7ad4ff9addc5c0d890f7b39488608d95d2359c3b051164aa"} Apr 21 10:07:07.841167 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.840808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"f556595f56352edddfdcba641ac95ecc31524e3ebd46df6c047a98469d6bfc9a"} Apr 21 10:07:07.922279 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:07.922248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fbdea7ca-953d-42cf-a40b-8dcca399d130-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-csfpt\" (UID: \"fbdea7ca-953d-42cf-a40b-8dcca399d130\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:07.922434 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:07:07.922413 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 10:07:07.922482 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:07:07.922476 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbdea7ca-953d-42cf-a40b-8dcca399d130-monitoring-plugin-cert podName:fbdea7ca-953d-42cf-a40b-8dcca399d130 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:08.422456696 +0000 UTC m=+192.817045946 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/fbdea7ca-953d-42cf-a40b-8dcca399d130-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-csfpt" (UID: "fbdea7ca-953d-42cf-a40b-8dcca399d130") : secret "monitoring-plugin-cert" not found Apr 21 10:07:08.179296 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.179265 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87"] Apr 21 10:07:08.196781 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.196756 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87"] Apr 21 10:07:08.196921 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.196876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.199536 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.199411 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 10:07:08.199536 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.199476 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 10:07:08.199744 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.199601 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5t5ql\"" Apr 21 10:07:08.199744 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.199662 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 10:07:08.199879 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.199862 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 10:07:08.199940 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.199865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 10:07:08.204867 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.204843 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 10:07:08.326540 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-telemeter-client-tls\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.326722 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-serving-certs-ca-bundle\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.326722 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-metrics-client-ca\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.326722 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.326915 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-federate-client-tls\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.326978 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.326978 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-secret-telemeter-client\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.327074 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.326990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92c5\" (UniqueName: \"kubernetes.io/projected/650ceee4-df02-41d5-bcd2-65a7241631a1-kube-api-access-n92c5\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428382 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-telemeter-client-tls\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428582 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-serving-certs-ca-bundle\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428582 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:07:08.428437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-metrics-client-ca\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428582 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428753 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fbdea7ca-953d-42cf-a40b-8dcca399d130-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-csfpt\" (UID: \"fbdea7ca-953d-42cf-a40b-8dcca399d130\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:08.428753 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428705 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-federate-client-tls\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428753 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428903 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-secret-telemeter-client\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.428903 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.428801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n92c5\" (UniqueName: \"kubernetes.io/projected/650ceee4-df02-41d5-bcd2-65a7241631a1-kube-api-access-n92c5\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.429230 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.429172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-metrics-client-ca\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.430013 
ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.429692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-serving-certs-ca-bundle\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.430013 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.429816 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ceee4-df02-41d5-bcd2-65a7241631a1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.431516 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.431493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-federate-client-tls\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.432241 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.432194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-secret-telemeter-client\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.432241 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.432216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fbdea7ca-953d-42cf-a40b-8dcca399d130-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-csfpt\" (UID: \"fbdea7ca-953d-42cf-a40b-8dcca399d130\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:08.432241 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.432227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.432471 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.432264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/650ceee4-df02-41d5-bcd2-65a7241631a1-telemeter-client-tls\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.436897 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.436875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92c5\" (UniqueName: \"kubernetes.io/projected/650ceee4-df02-41d5-bcd2-65a7241631a1-kube-api-access-n92c5\") pod \"telemeter-client-7ccbd7dc58-x6k87\" (UID: \"650ceee4-df02-41d5-bcd2-65a7241631a1\") " pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.508246 ip-10-0-132-46 
kubenswrapper[2577]: I0421 10:07:08.508210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" Apr 21 10:07:08.567073 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.567035 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:08.765907 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.765873 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt"] Apr 21 10:07:08.769173 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:07:08.769130 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbdea7ca_953d_42cf_a40b_8dcca399d130.slice/crio-283a969b8ddb853099b00371bbe6600e26bf2a855fea311dc09b6f82fa4147a9 WatchSource:0}: Error finding container 283a969b8ddb853099b00371bbe6600e26bf2a855fea311dc09b6f82fa4147a9: Status 404 returned error can't find the container with id 283a969b8ddb853099b00371bbe6600e26bf2a855fea311dc09b6f82fa4147a9 Apr 21 10:07:08.790544 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.790519 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87"] Apr 21 10:07:08.791455 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:07:08.791416 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650ceee4_df02_41d5_bcd2_65a7241631a1.slice/crio-69266365ee04ba869fde34404daec4c50c0c16ba62137bde9e7882e967db6a61 WatchSource:0}: Error finding container 69266365ee04ba869fde34404daec4c50c0c16ba62137bde9e7882e967db6a61: Status 404 returned error can't find the container with id 69266365ee04ba869fde34404daec4c50c0c16ba62137bde9e7882e967db6a61 Apr 21 10:07:08.844684 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.844644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" event={"ID":"650ceee4-df02-41d5-bcd2-65a7241631a1","Type":"ContainerStarted","Data":"69266365ee04ba869fde34404daec4c50c0c16ba62137bde9e7882e967db6a61"} Apr 21 10:07:08.846242 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.846217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"660f0eac6880265ae6371b81471aef274f8ca3735398d5142a837693e0c94094"} Apr 21 10:07:08.847062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:08.847042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" event={"ID":"fbdea7ca-953d-42cf-a40b-8dcca399d130","Type":"ContainerStarted","Data":"283a969b8ddb853099b00371bbe6600e26bf2a855fea311dc09b6f82fa4147a9"} Apr 21 10:07:09.135464 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.135437 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:07:09.160307 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.160275 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:07:09.160509 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.160491 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.163161 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163131 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 10:07:09.163294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163176 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 10:07:09.163294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163188 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 10:07:09.163472 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163454 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 10:07:09.163626 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163606 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 10:07:09.163753 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163725 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 10:07:09.163838 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 10:07:09.163913 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163607 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-f9598e997gjsu\"" Apr 21 10:07:09.163980 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163690 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 10:07:09.164041 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 10:07:09.164095 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.163669 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 10:07:09.164501 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.164467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 10:07:09.164501 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.164489 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-w4hsd\"" Apr 21 10:07:09.164870 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.164843 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 10:07:09.167511 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.167491 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 10:07:09.337008 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.336979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337128 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337128 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337128 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-config\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337262 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337262 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337262 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337262 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ebd8ccf-3b5d-4d84-b499-255a5753f609-config-out\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337422 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337422 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337422 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337335 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337422 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjz92\" (UniqueName: \"kubernetes.io/projected/2ebd8ccf-3b5d-4d84-b499-255a5753f609-kube-api-access-fjz92\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337422 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337621 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337466 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ebd8ccf-3b5d-4d84-b499-255a5753f609-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337621 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337621 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.337621 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-web-config\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 21 10:07:09.337753 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.337649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439149 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439149 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439417 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439417 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439547 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-config\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439547 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439547 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" 
(UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ebd8ccf-3b5d-4d84-b499-255a5753f609-config-out\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.439706 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjz92\" (UniqueName: \"kubernetes.io/projected/2ebd8ccf-3b5d-4d84-b499-255a5753f609-kube-api-access-fjz92\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.440002 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.440002 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ebd8ccf-3b5d-4d84-b499-255a5753f609-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.440002 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
10:07:09.440002 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.440002 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-web-config\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.440002 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.439975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.440954 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.440890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.441643 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.441475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.441962 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.441883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.442701 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.442668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.444621 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.444516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-web-config\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.444621 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.444610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-config\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.444886 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.444868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2ebd8ccf-3b5d-4d84-b499-255a5753f609-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.445190 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.445166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.446081 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.446035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.446081 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.446031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2ebd8ccf-3b5d-4d84-b499-255a5753f609-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.446287 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.446246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2ebd8ccf-3b5d-4d84-b499-255a5753f609-config-out\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.446412 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.446294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.446687 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.446661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.447214 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.447196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.447789 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.447772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjz92\" (UniqueName: 
\"kubernetes.io/projected/2ebd8ccf-3b5d-4d84-b499-255a5753f609-kube-api-access-fjz92\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.453689 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.453661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.453770 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.453663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2ebd8ccf-3b5d-4d84-b499-255a5753f609-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2ebd8ccf-3b5d-4d84-b499-255a5753f609\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.470564 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.470517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:09.623439 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.623358 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:07:09.857577 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.857484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"91df264cb234b37ae7743b993bc7f052c3efdfb0090172ce1d4b35f4cdc55925"} Apr 21 10:07:09.857577 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.857529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"40af6b7e5a75ca06012e66fea3e7471a31389c2ab16cd56993cf27a062fecfe2"} Apr 21 10:07:09.857577 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.857544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" event={"ID":"7a60bfc1-ed25-4ed1-8e6e-906df625036f","Type":"ContainerStarted","Data":"65e824c76e3cf9b62ed14998b63ba001ecb28d5f340ba287ce441b5e563d96e2"} Apr 21 10:07:09.858123 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.857666 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:09.878665 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:09.878607 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" podStartSLOduration=2.291512527 podStartE2EDuration="5.878588107s" podCreationTimestamp="2026-04-21 10:07:04 +0000 UTC" firstStartedPulling="2026-04-21 10:07:05.443587872 +0000 UTC m=+189.838177135" lastFinishedPulling="2026-04-21 10:07:09.030663451 +0000 UTC m=+193.425252715" observedRunningTime="2026-04-21 10:07:09.877318589 +0000 UTC m=+194.271907861" watchObservedRunningTime="2026-04-21 10:07:09.878588107 +0000 UTC m=+194.273177443" Apr 21 10:07:10.051935 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:07:10.051902 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ebd8ccf_3b5d_4d84_b499_255a5753f609.slice/crio-91ce3b7fe811f8ad4aa57d1a4472c85e3fe7b3fd23746c1f5c3186fb914df1a9 WatchSource:0}: Error finding container 91ce3b7fe811f8ad4aa57d1a4472c85e3fe7b3fd23746c1f5c3186fb914df1a9: Status 404 returned error can't find the container with id 91ce3b7fe811f8ad4aa57d1a4472c85e3fe7b3fd23746c1f5c3186fb914df1a9 Apr 21 10:07:10.862203 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:10.862160 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"91ce3b7fe811f8ad4aa57d1a4472c85e3fe7b3fd23746c1f5c3186fb914df1a9"} Apr 21 10:07:11.866261 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:11.866218 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" event={"ID":"fbdea7ca-953d-42cf-a40b-8dcca399d130","Type":"ContainerStarted","Data":"a9ed90b5b16b2c253c0fadd4d87e62ef7f47076f2000abbc0dd9200151d5b7f8"} Apr 21 10:07:11.866734 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:11.866431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:11.867721 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:11.867692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" event={"ID":"650ceee4-df02-41d5-bcd2-65a7241631a1","Type":"ContainerStarted","Data":"b0d00628425df6c4314074b03cd35a66bfcbc9934a4206d0389a22ba69360cec"} Apr 21 10:07:11.871505 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:11.871479 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" Apr 21 10:07:11.881262 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:11.881219 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-csfpt" podStartSLOduration=2.750759925 podStartE2EDuration="4.881207677s" podCreationTimestamp="2026-04-21 10:07:07 +0000 UTC" firstStartedPulling="2026-04-21 10:07:08.771177121 +0000 UTC m=+193.165766383" lastFinishedPulling="2026-04-21 10:07:10.901624885 +0000 UTC m=+195.296214135" observedRunningTime="2026-04-21 10:07:11.880250403 +0000 UTC m=+196.274839686" watchObservedRunningTime="2026-04-21 10:07:11.881207677 +0000 UTC m=+196.275796993" Apr 21 10:07:12.872190 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:12.872105 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ebd8ccf-3b5d-4d84-b499-255a5753f609" containerID="efb88a8bfe643857c74956ccaa03357b3eaf5f2b3afdd20f77e4938d3faa6270" exitCode=0 Apr 21 10:07:12.872681 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:12.872188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerDied","Data":"efb88a8bfe643857c74956ccaa03357b3eaf5f2b3afdd20f77e4938d3faa6270"} Apr 21 10:07:12.874450 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:12.874425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" event={"ID":"650ceee4-df02-41d5-bcd2-65a7241631a1","Type":"ContainerStarted","Data":"68f0c0dff17934a934c66a2b87497e21bf6829696a91b310d7613917e092b43a"} Apr 21 10:07:12.874561 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:07:12.874458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" event={"ID":"650ceee4-df02-41d5-bcd2-65a7241631a1","Type":"ContainerStarted","Data":"6588387402db1dffad0474300e38deed3765428f5d25862bec55e6bf20514425"} Apr 21 10:07:12.927948 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:12.927907 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7ccbd7dc58-x6k87" podStartSLOduration=1.378889615 podStartE2EDuration="4.927893383s" podCreationTimestamp="2026-04-21 10:07:08 +0000 UTC" firstStartedPulling="2026-04-21 10:07:08.793198883 +0000 UTC m=+193.187788133" lastFinishedPulling="2026-04-21 10:07:12.342202652 +0000 UTC m=+196.736791901" observedRunningTime="2026-04-21 10:07:12.926277891 +0000 UTC m=+197.320867174" watchObservedRunningTime="2026-04-21 10:07:12.927893383 +0000 UTC m=+197.322482653" Apr 21 10:07:15.868164 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:15.868132 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5c4586d5c6-fqbln" Apr 21 10:07:16.889516 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.889481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"49ed0d7406ba9933ed89e7c69059442142892d58c3d8ecafbb3b6ecd36fe75cf"} Apr 21 10:07:16.889516 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.889516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"b55941066e588f873163b76e7378e2920f7a0fcdd31e5c484a549710aef63223"} Apr 21 10:07:16.889516 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.889526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"beb048c9e61b54b14e7c077c31da0c8b662322b535230f7e510988261f135b85"} Apr 21 10:07:16.889940 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.889534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"38b652f023ce242f322cceb8abb2cc36abc2384a93201e7f9c2302eb6d54874b"} Apr 21 10:07:16.889940 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.889542 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"1393c2d20feadce434c1c43f297ba336d5ff3e7d3632dcfb8e11ca6ff2f5a254"} Apr 21 10:07:16.889940 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.889549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2ebd8ccf-3b5d-4d84-b499-255a5753f609","Type":"ContainerStarted","Data":"b0c10e563b6c89b9bafe14d7f228f15e236ebf47cceae2d90a980d2ded57f0ce"} Apr 21 10:07:16.923386 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:16.923340 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.97999068 podStartE2EDuration="7.92332193s" podCreationTimestamp="2026-04-21 10:07:09 +0000 UTC" firstStartedPulling="2026-04-21 10:07:10.05436655 +0000 UTC m=+194.448955802" 
lastFinishedPulling="2026-04-21 10:07:15.997697797 +0000 UTC m=+200.392287052" observedRunningTime="2026-04-21 10:07:16.922057808 +0000 UTC m=+201.316647091" watchObservedRunningTime="2026-04-21 10:07:16.92332193 +0000 UTC m=+201.317911273" Apr 21 10:07:19.470791 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:19.470755 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:07:42.961869 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:42.961840 2577 generic.go:358] "Generic (PLEG): container finished" podID="6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0" containerID="690abdbe1689b5dffbb76e2807b5261503fa1c6ab3aafe95ba7784ee890b4eff" exitCode=0 Apr 21 10:07:42.962191 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:42.961886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-khjfc" event={"ID":"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0","Type":"ContainerDied","Data":"690abdbe1689b5dffbb76e2807b5261503fa1c6ab3aafe95ba7784ee890b4eff"} Apr 21 10:07:42.962230 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:42.962210 2577 scope.go:117] "RemoveContainer" containerID="690abdbe1689b5dffbb76e2807b5261503fa1c6ab3aafe95ba7784ee890b4eff" Apr 21 10:07:43.967446 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:43.967384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-khjfc" event={"ID":"6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0","Type":"ContainerStarted","Data":"397e619162f22c362675c0278bc000e82584370543894c5eb76eef392c49713b"} Apr 21 10:07:47.980525 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:47.980479 2577 generic.go:358] "Generic (PLEG): container finished" podID="b613a503-a4ac-455e-80bf-2ffd14fe2b3d" containerID="b001c368d82f2c6b56084ebb00d221f863632a74c7ab3e18b7964222f96df755" exitCode=0 Apr 21 10:07:47.980931 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:47.980560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" event={"ID":"b613a503-a4ac-455e-80bf-2ffd14fe2b3d","Type":"ContainerDied","Data":"b001c368d82f2c6b56084ebb00d221f863632a74c7ab3e18b7964222f96df755"} Apr 21 10:07:47.980931 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:47.980908 2577 scope.go:117] "RemoveContainer" containerID="b001c368d82f2c6b56084ebb00d221f863632a74c7ab3e18b7964222f96df755" Apr 21 10:07:48.985475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:07:48.985440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7d6t" event={"ID":"b613a503-a4ac-455e-80bf-2ffd14fe2b3d","Type":"ContainerStarted","Data":"39d99e35303fb928f32aef4e758aa1f9537220ad0b619d2ea6e6115d158d2359"} Apr 21 10:08:08.054291 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:08.054208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod \"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:08:08.056641 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:08.056623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f79-2df1-4414-b256-f90091f5fa3c-metrics-certs\") pod 
\"network-metrics-daemon-7czdf\" (UID: \"2ae89f79-2df1-4414-b256-f90091f5fa3c\") " pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:08:08.304239 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:08.304204 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7zbjr\"" Apr 21 10:08:08.312447 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:08.312373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7czdf" Apr 21 10:08:08.434196 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:08.434155 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7czdf"] Apr 21 10:08:08.436980 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:08:08.436951 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ae89f79_2df1_4414_b256_f90091f5fa3c.slice/crio-4b748b3d99b51fb640577a826643ead7cc95e858c482d576c0b8fedaa12f6c8b WatchSource:0}: Error finding container 4b748b3d99b51fb640577a826643ead7cc95e858c482d576c0b8fedaa12f6c8b: Status 404 returned error can't find the container with id 4b748b3d99b51fb640577a826643ead7cc95e858c482d576c0b8fedaa12f6c8b Apr 21 10:08:09.048923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:09.048888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7czdf" event={"ID":"2ae89f79-2df1-4414-b256-f90091f5fa3c","Type":"ContainerStarted","Data":"4b748b3d99b51fb640577a826643ead7cc95e858c482d576c0b8fedaa12f6c8b"} Apr 21 10:08:09.471543 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:09.471517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:09.488337 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:09.488309 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:10.054119 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:10.054084 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7czdf" event={"ID":"2ae89f79-2df1-4414-b256-f90091f5fa3c","Type":"ContainerStarted","Data":"e97e6fa805099c8121b2e3deb75388cc582922127d26e0b963af521b02c23fa5"} Apr 21 10:08:10.054119 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:10.054119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7czdf" event={"ID":"2ae89f79-2df1-4414-b256-f90091f5fa3c","Type":"ContainerStarted","Data":"462f24fdd27c2447c513b1b2c4cc134a06c21137a5fbe70113aa78dcac291749"} Apr 21 10:08:10.070234 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:10.070185 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7czdf" podStartSLOduration=253.116548073 podStartE2EDuration="4m14.070171107s" podCreationTimestamp="2026-04-21 10:03:56 +0000 UTC" firstStartedPulling="2026-04-21 10:08:08.439095605 +0000 UTC m=+252.833684854" lastFinishedPulling="2026-04-21 10:08:09.392718626 +0000 UTC m=+253.787307888" observedRunningTime="2026-04-21 10:08:10.069230151 +0000 UTC m=+254.463819435" watchObservedRunningTime="2026-04-21 10:08:10.070171107 +0000 UTC m=+254.464760377" Apr 21 10:08:10.070813 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:10.070790 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:08:56.087676 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:56.087641 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:08:56.088207 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:56.087922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:08:56.095021 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:56.095003 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:08:56.095150 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:56.095053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:08:56.097856 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:08:56.097840 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 10:09:02.983717 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:02.983680 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jsmds"] Apr 21 10:09:02.986940 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:02.986920 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:02.989961 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:02.989939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 10:09:02.995049 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:02.995025 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jsmds"] Apr 21 10:09:03.009960 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.009932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-original-pull-secret\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.010080 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.009977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-kubelet-config\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.010137 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.010109 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-dbus\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.111375 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.111345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-kubelet-config\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.111558 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.111465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-kubelet-config\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.111558 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.111470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-dbus\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.111558 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.111542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-original-pull-secret\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.111702 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.111610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-dbus\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.113849 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.113830 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d-original-pull-secret\") pod \"global-pull-secret-syncer-jsmds\" (UID: \"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d\") " pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.297073 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.296997 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jsmds" Apr 21 10:09:03.415478 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.415346 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jsmds"] Apr 21 10:09:03.418256 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:09:03.418227 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcfc06b_afe8_46d7_ae0e_1eef7cdb631d.slice/crio-feabd8f16fb82ad0e7ed2fbd67f72de44fa7c7671475c2280cdc8bf130fd209f WatchSource:0}: Error finding container feabd8f16fb82ad0e7ed2fbd67f72de44fa7c7671475c2280cdc8bf130fd209f: Status 404 returned error can't find the container with id feabd8f16fb82ad0e7ed2fbd67f72de44fa7c7671475c2280cdc8bf130fd209f Apr 21 10:09:03.420184 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:03.420167 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:09:04.221309 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:04.221273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jsmds" event={"ID":"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d","Type":"ContainerStarted","Data":"feabd8f16fb82ad0e7ed2fbd67f72de44fa7c7671475c2280cdc8bf130fd209f"} Apr 21 10:09:08.234462 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:08.234423 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jsmds" event={"ID":"6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d","Type":"ContainerStarted","Data":"da8923a3681bc6fb2c7332aa675b864985d38e885d450055649f889887a65642"} Apr 21 10:09:08.253970 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:09:08.253906 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jsmds" podStartSLOduration=2.440650309 podStartE2EDuration="6.253891117s" podCreationTimestamp="2026-04-21 10:09:02 +0000 UTC" firstStartedPulling="2026-04-21 10:09:03.420290143 +0000 UTC m=+307.814879392" lastFinishedPulling="2026-04-21 10:09:07.233530937 +0000 UTC m=+311.628120200" observedRunningTime="2026-04-21 10:09:08.253377117 +0000 UTC m=+312.647966387" watchObservedRunningTime="2026-04-21 10:09:08.253891117 +0000 UTC m=+312.648480388" Apr 21 10:10:59.794927 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.794896 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-cpkcs"] Apr 21 10:10:59.798107 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.798092 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:10:59.800590 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.800564 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 10:10:59.800710 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.800598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-gmhlm\"" Apr 21 10:10:59.801641 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.801624 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 10:10:59.807697 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.807678 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-cpkcs"] Apr 21 10:10:59.895598 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.895558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2vn\" (UniqueName: \"kubernetes.io/projected/3d491cca-2419-45ef-8622-7ee0af76541c-kube-api-access-wm2vn\") pod \"cert-manager-webhook-587ccfb98-cpkcs\" (UID: \"3d491cca-2419-45ef-8622-7ee0af76541c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:10:59.895767 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.895622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d491cca-2419-45ef-8622-7ee0af76541c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-cpkcs\" (UID: \"3d491cca-2419-45ef-8622-7ee0af76541c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:10:59.996643 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.996606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm2vn\" (UniqueName: \"kubernetes.io/projected/3d491cca-2419-45ef-8622-7ee0af76541c-kube-api-access-wm2vn\") pod \"cert-manager-webhook-587ccfb98-cpkcs\" (UID: \"3d491cca-2419-45ef-8622-7ee0af76541c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:10:59.996835 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:10:59.996667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d491cca-2419-45ef-8622-7ee0af76541c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-cpkcs\" (UID: \"3d491cca-2419-45ef-8622-7ee0af76541c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:11:00.005358 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:00.005331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d491cca-2419-45ef-8622-7ee0af76541c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-cpkcs\" (UID: \"3d491cca-2419-45ef-8622-7ee0af76541c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:11:00.005761 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:00.005741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm2vn\" (UniqueName: \"kubernetes.io/projected/3d491cca-2419-45ef-8622-7ee0af76541c-kube-api-access-wm2vn\") pod \"cert-manager-webhook-587ccfb98-cpkcs\" (UID: \"3d491cca-2419-45ef-8622-7ee0af76541c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:11:00.122232 
ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:00.122151 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:11:00.244998 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:00.244975 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-cpkcs"] Apr 21 10:11:00.247840 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:11:00.247812 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d491cca_2419_45ef_8622_7ee0af76541c.slice/crio-2d1faa43e467f079e937b16cdb1ce4b04750abee531531cc60ca8f4a3aa96fa7 WatchSource:0}: Error finding container 2d1faa43e467f079e937b16cdb1ce4b04750abee531531cc60ca8f4a3aa96fa7: Status 404 returned error can't find the container with id 2d1faa43e467f079e937b16cdb1ce4b04750abee531531cc60ca8f4a3aa96fa7 Apr 21 10:11:00.560187 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:00.560151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" event={"ID":"3d491cca-2419-45ef-8622-7ee0af76541c","Type":"ContainerStarted","Data":"2d1faa43e467f079e937b16cdb1ce4b04750abee531531cc60ca8f4a3aa96fa7"} Apr 21 10:11:03.571759 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:03.571725 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" event={"ID":"3d491cca-2419-45ef-8622-7ee0af76541c","Type":"ContainerStarted","Data":"bdfc03b94f5da7e7c0a5d586f805171ab2bb0da2e54648abece35d0f1b612944"} Apr 21 10:11:03.571759 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:03.571776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:11:03.588900 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:03.588851 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" podStartSLOduration=1.801348143 podStartE2EDuration="4.588838182s" podCreationTimestamp="2026-04-21 10:10:59 +0000 UTC" firstStartedPulling="2026-04-21 10:11:00.250053727 +0000 UTC m=+424.644642977" lastFinishedPulling="2026-04-21 10:11:03.037543764 +0000 UTC m=+427.432133016" observedRunningTime="2026-04-21 10:11:03.587578062 +0000 UTC m=+427.982167333" watchObservedRunningTime="2026-04-21 10:11:03.588838182 +0000 UTC m=+427.983427452" Apr 21 10:11:09.577422 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:09.577322 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-cpkcs" Apr 21 10:11:12.748427 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.748374 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hqgth"] Apr 21 10:11:12.751836 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.751812 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:12.754278 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.754257 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-srbr4\"" Apr 21 10:11:12.760558 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.760534 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hqgth"] Apr 21 10:11:12.812561 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.812536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c95c037-d32d-4d60-a0c4-e7e3e2240662-bound-sa-token\") pod \"cert-manager-79c8d999ff-hqgth\" (UID: \"7c95c037-d32d-4d60-a0c4-e7e3e2240662\") " pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:12.812693 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.812575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l6jf\" (UniqueName: \"kubernetes.io/projected/7c95c037-d32d-4d60-a0c4-e7e3e2240662-kube-api-access-7l6jf\") pod \"cert-manager-79c8d999ff-hqgth\" (UID: \"7c95c037-d32d-4d60-a0c4-e7e3e2240662\") " pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:12.913843 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.913809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c95c037-d32d-4d60-a0c4-e7e3e2240662-bound-sa-token\") pod \"cert-manager-79c8d999ff-hqgth\" (UID: \"7c95c037-d32d-4d60-a0c4-e7e3e2240662\") " pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:12.914007 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.913851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l6jf\" (UniqueName: \"kubernetes.io/projected/7c95c037-d32d-4d60-a0c4-e7e3e2240662-kube-api-access-7l6jf\") pod \"cert-manager-79c8d999ff-hqgth\" (UID: \"7c95c037-d32d-4d60-a0c4-e7e3e2240662\") " pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:12.922758 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.922731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c95c037-d32d-4d60-a0c4-e7e3e2240662-bound-sa-token\") pod \"cert-manager-79c8d999ff-hqgth\" (UID: \"7c95c037-d32d-4d60-a0c4-e7e3e2240662\") " pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:12.923012 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:12.922987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l6jf\" (UniqueName: \"kubernetes.io/projected/7c95c037-d32d-4d60-a0c4-e7e3e2240662-kube-api-access-7l6jf\") pod \"cert-manager-79c8d999ff-hqgth\" (UID: \"7c95c037-d32d-4d60-a0c4-e7e3e2240662\") " pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:13.061424 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:13.061320 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hqgth" Apr 21 10:11:13.184195 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:13.184170 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hqgth"] Apr 21 10:11:13.186020 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:11:13.185993 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c95c037_d32d_4d60_a0c4_e7e3e2240662.slice/crio-1ed6ef744115d7b80e7f80d5855f1427a92afae5d4f9a1890824b404ac3b6590 WatchSource:0}: Error finding container 1ed6ef744115d7b80e7f80d5855f1427a92afae5d4f9a1890824b404ac3b6590: Status 404 returned error can't find the container with id 1ed6ef744115d7b80e7f80d5855f1427a92afae5d4f9a1890824b404ac3b6590 Apr 21 10:11:13.611573 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:13.611532 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hqgth" event={"ID":"7c95c037-d32d-4d60-a0c4-e7e3e2240662","Type":"ContainerStarted","Data":"7cfa5fbee7ce253288b99fe20c240c708319908e0f8de573339765679a28adda"} Apr 21 10:11:13.611573 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:13.611570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hqgth" event={"ID":"7c95c037-d32d-4d60-a0c4-e7e3e2240662","Type":"ContainerStarted","Data":"1ed6ef744115d7b80e7f80d5855f1427a92afae5d4f9a1890824b404ac3b6590"} Apr 21 10:11:13.626849 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:13.626798 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-hqgth" podStartSLOduration=1.626784102 podStartE2EDuration="1.626784102s" podCreationTimestamp="2026-04-21 10:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:11:13.626549081 +0000 UTC m=+438.021138353" watchObservedRunningTime="2026-04-21 10:11:13.626784102 +0000 UTC m=+438.021373373" Apr 21 10:11:42.862177 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.862145 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j"] Apr 21 10:11:42.865095 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.865078 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:42.869833 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.869812 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 10:11:42.869941 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.869812 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 10:11:42.870475 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.870456 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lmcjb\"" Apr 21 10:11:42.870533 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.870489 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 10:11:42.870533 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.870496 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:11:42.870533 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.870491 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 10:11:42.878286 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.878265 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j"] Apr 21 10:11:42.965689 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.965657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pmv\" (UniqueName: \"kubernetes.io/projected/7757718f-8e4e-4339-812a-dfa40f1d911f-kube-api-access-f4pmv\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:42.965844 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.965701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7757718f-8e4e-4339-812a-dfa40f1d911f-metrics-cert\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:42.965844 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.965795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7757718f-8e4e-4339-812a-dfa40f1d911f-cert\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:42.965844 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:42.965825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7757718f-8e4e-4339-812a-dfa40f1d911f-manager-config\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.066494 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.066462 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7757718f-8e4e-4339-812a-dfa40f1d911f-metrics-cert\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.066654 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.066515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7757718f-8e4e-4339-812a-dfa40f1d911f-cert\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.066654 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.066538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7757718f-8e4e-4339-812a-dfa40f1d911f-manager-config\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.066654 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.066586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pmv\" (UniqueName: \"kubernetes.io/projected/7757718f-8e4e-4339-812a-dfa40f1d911f-kube-api-access-f4pmv\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.067213 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.067159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7757718f-8e4e-4339-812a-dfa40f1d911f-manager-config\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.069068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.069045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7757718f-8e4e-4339-812a-dfa40f1d911f-metrics-cert\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.069177 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.069158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7757718f-8e4e-4339-812a-dfa40f1d911f-cert\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.075022 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.074997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pmv\" (UniqueName: \"kubernetes.io/projected/7757718f-8e4e-4339-812a-dfa40f1d911f-kube-api-access-f4pmv\") pod \"lws-controller-manager-5bbdf94c78-rw82j\" (UID: \"7757718f-8e4e-4339-812a-dfa40f1d911f\") " pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.175752 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.175722 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:43.300852 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.300820 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j"] Apr 21 10:11:43.304456 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:11:43.304428 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7757718f_8e4e_4339_812a_dfa40f1d911f.slice/crio-7fe9cf71008be2fe9a09a5e83b5687bb7de66ac4e441197d9110b6364ff89f76 WatchSource:0}: Error finding container 7fe9cf71008be2fe9a09a5e83b5687bb7de66ac4e441197d9110b6364ff89f76: Status 404 returned error can't find the container with id 7fe9cf71008be2fe9a09a5e83b5687bb7de66ac4e441197d9110b6364ff89f76 Apr 21 10:11:43.707811 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:43.707776 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" event={"ID":"7757718f-8e4e-4339-812a-dfa40f1d911f","Type":"ContainerStarted","Data":"7fe9cf71008be2fe9a09a5e83b5687bb7de66ac4e441197d9110b6364ff89f76"} Apr 21 10:11:46.721079 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:46.721043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" event={"ID":"7757718f-8e4e-4339-812a-dfa40f1d911f","Type":"ContainerStarted","Data":"132efc65e3d6d16d7621ad44c60e73dc5896f3b432ac2c56558b2d6d3567e716"} Apr 21 10:11:46.721480 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:46.721100 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:11:46.737837 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:46.737787 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" podStartSLOduration=2.148180346 podStartE2EDuration="4.737773209s" podCreationTimestamp="2026-04-21 10:11:42 +0000 UTC" firstStartedPulling="2026-04-21 10:11:43.306199823 +0000 UTC m=+467.700789072" lastFinishedPulling="2026-04-21 10:11:45.895792683 +0000 UTC m=+470.290381935" observedRunningTime="2026-04-21 10:11:46.736444746 +0000 UTC m=+471.131034017" watchObservedRunningTime="2026-04-21 10:11:46.737773209 +0000 UTC m=+471.132362480" Apr 21 10:11:57.726321 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:11:57.726287 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bbdf94c78-rw82j" Apr 21 10:12:14.205376 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.205347 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph"] Apr 21 10:12:14.208446 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.208430 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.210920 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.210897 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 10:12:14.211011 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.210956 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-d6chc\"" Apr 21 10:12:14.221269 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.221243 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph"] Apr 21 10:12:14.336156 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336156 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/39246080-2914-48cc-ba31-4e0e831a5cfb-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-podinfo\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tb4\" (UniqueName: \"kubernetes.io/projected/39246080-2914-48cc-ba31-4e0e831a5cfb-kube-api-access-q9tb4\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.336593 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.336456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437325 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437325 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437325 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/39246080-2914-48cc-ba31-4e0e831a5cfb-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437692 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437692 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437692 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tb4\" (UniqueName: \"kubernetes.io/projected/39246080-2914-48cc-ba31-4e0e831a5cfb-kube-api-access-q9tb4\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437692 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437692 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437938 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.437938 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.437841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.438070 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.438037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.438150 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.438084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.438150 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.438118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/39246080-2914-48cc-ba31-4e0e831a5cfb-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.438150 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.438131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.439969 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.439949 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.440116 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.440101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.445292 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.445268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39246080-2914-48cc-ba31-4e0e831a5cfb-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.445450 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.445433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tb4\" (UniqueName: \"kubernetes.io/projected/39246080-2914-48cc-ba31-4e0e831a5cfb-kube-api-access-q9tb4\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4bfph\" (UID: \"39246080-2914-48cc-ba31-4e0e831a5cfb\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.518862 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.518776 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:14.642737 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.642697 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph"] Apr 21 10:12:14.646551 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:12:14.646521 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39246080_2914_48cc_ba31_4e0e831a5cfb.slice/crio-e94a712413a976174d51c05d20e02880fc0e1edfe2054490640e7a8fd6a94b1b WatchSource:0}: Error finding container e94a712413a976174d51c05d20e02880fc0e1edfe2054490640e7a8fd6a94b1b: Status 404 returned error can't find the container with id e94a712413a976174d51c05d20e02880fc0e1edfe2054490640e7a8fd6a94b1b Apr 21 10:12:14.815226 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:14.815143 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" event={"ID":"39246080-2914-48cc-ba31-4e0e831a5cfb","Type":"ContainerStarted","Data":"e94a712413a976174d51c05d20e02880fc0e1edfe2054490640e7a8fd6a94b1b"} Apr 21 10:12:17.527841 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:17.527792 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 10:12:17.528123 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:17.527870 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 10:12:17.528123 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:17.527897 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 10:12:17.827372 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:17.827332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" event={"ID":"39246080-2914-48cc-ba31-4e0e831a5cfb","Type":"ContainerStarted","Data":"4bdb0416da26aa144daf3f1b1bce99fb115d55d29aecce14c4506a8a8a590cd9"} Apr 21 10:12:17.848556 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:17.848494 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" podStartSLOduration=0.96946396 podStartE2EDuration="3.848476011s" podCreationTimestamp="2026-04-21 10:12:14 +0000 UTC" firstStartedPulling="2026-04-21 10:12:14.648546318 +0000 UTC m=+499.043135569" lastFinishedPulling="2026-04-21 10:12:17.52755837 +0000 UTC m=+501.922147620" observedRunningTime="2026-04-21 10:12:17.847493337 +0000 UTC m=+502.242082622" watchObservedRunningTime="2026-04-21 10:12:17.848476011 +0000 UTC m=+502.243065280" Apr 21 10:12:18.519845 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:18.519806 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:18.524293 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:18.524269 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:18.830698 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:18.830615 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:12:18.831524 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:12:18.831503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4bfph" Apr 21 10:13:28.180536 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.180501 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-krfcv"] Apr 21 10:13:28.183949 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.183929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:13:28.186347 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.186314 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 10:13:28.186477 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.186356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-pk4bj\"" Apr 21 10:13:28.187426 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.187382 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 10:13:28.193203 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.193182 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-krfcv"] Apr 21 10:13:28.301519 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.301487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhxs4\" (UniqueName: \"kubernetes.io/projected/1cc99ee8-6691-45dc-ba78-853204f37b27-kube-api-access-zhxs4\") pod \"authorino-674b59b84c-krfcv\" (UID: \"1cc99ee8-6691-45dc-ba78-853204f37b27\") " pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:13:28.402357 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.402314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhxs4\" (UniqueName: \"kubernetes.io/projected/1cc99ee8-6691-45dc-ba78-853204f37b27-kube-api-access-zhxs4\") pod \"authorino-674b59b84c-krfcv\" (UID: \"1cc99ee8-6691-45dc-ba78-853204f37b27\") " pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:13:28.410905 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.410884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhxs4\" (UniqueName: \"kubernetes.io/projected/1cc99ee8-6691-45dc-ba78-853204f37b27-kube-api-access-zhxs4\") pod \"authorino-674b59b84c-krfcv\" (UID: \"1cc99ee8-6691-45dc-ba78-853204f37b27\") " pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:13:28.494289 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.494222 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:13:28.611235 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:28.611160 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-krfcv"] Apr 21 10:13:28.613674 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:13:28.613644 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc99ee8_6691_45dc_ba78_853204f37b27.slice/crio-39f08957f23128854112c4a960e564b6f7df9123ba78d60dad7cae0420af3f2b WatchSource:0}: Error finding container 39f08957f23128854112c4a960e564b6f7df9123ba78d60dad7cae0420af3f2b: Status 404 returned error can't find the container with id 39f08957f23128854112c4a960e564b6f7df9123ba78d60dad7cae0420af3f2b Apr 21 10:13:29.085835 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:29.085796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-krfcv" event={"ID":"1cc99ee8-6691-45dc-ba78-853204f37b27","Type":"ContainerStarted","Data":"39f08957f23128854112c4a960e564b6f7df9123ba78d60dad7cae0420af3f2b"} Apr 21 10:13:32.106847 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:32.106809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-krfcv" event={"ID":"1cc99ee8-6691-45dc-ba78-853204f37b27","Type":"ContainerStarted","Data":"97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6"} Apr 21 10:13:32.123025 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:32.122973 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-krfcv" podStartSLOduration=1.29491012 podStartE2EDuration="4.122959452s" podCreationTimestamp="2026-04-21 10:13:28 +0000 UTC" firstStartedPulling="2026-04-21 10:13:28.615247958 +0000 UTC m=+573.009837207" lastFinishedPulling="2026-04-21 10:13:31.44329729 +0000 UTC m=+575.837886539" observedRunningTime="2026-04-21 10:13:32.120455685 +0000 UTC m=+576.515044980" watchObservedRunningTime="2026-04-21 10:13:32.122959452 +0000 UTC m=+576.517548724" Apr 21 10:13:56.114105 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:56.114079 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:13:56.114675 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:56.114079 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:13:56.120583 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:56.120566 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:13:56.120691 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:13:56.120611 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:18:27.638801 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:27.638768 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-krfcv"] Apr 21 10:18:27.639360 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:27.638972 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/authorino-674b59b84c-krfcv" podUID="1cc99ee8-6691-45dc-ba78-853204f37b27" containerName="authorino" containerID="cri-o://97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6" gracePeriod=30 Apr 21 10:18:27.875167 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:27.875145 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:18:27.988813 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:27.988781 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhxs4\" (UniqueName: \"kubernetes.io/projected/1cc99ee8-6691-45dc-ba78-853204f37b27-kube-api-access-zhxs4\") pod \"1cc99ee8-6691-45dc-ba78-853204f37b27\" (UID: \"1cc99ee8-6691-45dc-ba78-853204f37b27\") " Apr 21 10:18:27.991029 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:27.991002 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc99ee8-6691-45dc-ba78-853204f37b27-kube-api-access-zhxs4" (OuterVolumeSpecName: "kube-api-access-zhxs4") pod "1cc99ee8-6691-45dc-ba78-853204f37b27" (UID: "1cc99ee8-6691-45dc-ba78-853204f37b27"). InnerVolumeSpecName "kube-api-access-zhxs4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:18:28.085682 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.085644 2577 generic.go:358] "Generic (PLEG): container finished" podID="1cc99ee8-6691-45dc-ba78-853204f37b27" containerID="97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6" exitCode=0 Apr 21 10:18:28.085823 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.085694 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-krfcv" Apr 21 10:18:28.085823 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.085705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-krfcv" event={"ID":"1cc99ee8-6691-45dc-ba78-853204f37b27","Type":"ContainerDied","Data":"97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6"} Apr 21 10:18:28.085823 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.085735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-krfcv" event={"ID":"1cc99ee8-6691-45dc-ba78-853204f37b27","Type":"ContainerDied","Data":"39f08957f23128854112c4a960e564b6f7df9123ba78d60dad7cae0420af3f2b"} Apr 21 10:18:28.085823 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.085750 2577 scope.go:117] "RemoveContainer" containerID="97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6" Apr 21 10:18:28.089430 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.089383 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhxs4\" (UniqueName: \"kubernetes.io/projected/1cc99ee8-6691-45dc-ba78-853204f37b27-kube-api-access-zhxs4\") on node \"ip-10-0-132-46.ec2.internal\" DevicePath \"\"" Apr 21 10:18:28.094294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.094276 2577 scope.go:117] "RemoveContainer" containerID="97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6" Apr 21 10:18:28.094562 ip-10-0-132-46 kubenswrapper[2577]: E0421 10:18:28.094541 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6\": container with ID starting with 97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6 not 
found: ID does not exist" containerID="97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6" Apr 21 10:18:28.094626 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.094573 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6"} err="failed to get container status \"97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6\": rpc error: code = NotFound desc = could not find container \"97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6\": container with ID starting with 97603fc09cdf70f7440649139f5344e4d872f4beb6927305e1298989d2e837f6 not found: ID does not exist" Apr 21 10:18:28.106262 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.106241 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-krfcv"] Apr 21 10:18:28.109777 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.109760 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-krfcv"] Apr 21 10:18:28.203893 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:28.203863 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc99ee8-6691-45dc-ba78-853204f37b27" path="/var/lib/kubelet/pods/1cc99ee8-6691-45dc-ba78-853204f37b27/volumes" Apr 21 10:18:43.020797 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.020749 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-tt5rb"] Apr 21 10:18:43.021253 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.021242 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cc99ee8-6691-45dc-ba78-853204f37b27" containerName="authorino" Apr 21 10:18:43.021294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.021255 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc99ee8-6691-45dc-ba78-853204f37b27" containerName="authorino" Apr 21 10:18:43.021336 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.021325 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cc99ee8-6691-45dc-ba78-853204f37b27" containerName="authorino" Apr 21 10:18:43.024102 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.024087 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-tt5rb" Apr 21 10:18:43.026669 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.026642 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 10:18:43.027826 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.027805 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-td8hf\"" Apr 21 10:18:43.027826 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.027817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 10:18:43.029098 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.029076 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tt5rb"] Apr 21 10:18:43.122314 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.122277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhxl\" (UniqueName: \"kubernetes.io/projected/62b92f08-3bfd-4afd-896c-0d83176fc5b1-kube-api-access-rqhxl\") pod \"authorino-674b59b84c-tt5rb\" (UID: \"62b92f08-3bfd-4afd-896c-0d83176fc5b1\") " pod="kuadrant-system/authorino-674b59b84c-tt5rb" Apr 21 10:18:43.222878 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.222850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhxl\" (UniqueName: \"kubernetes.io/projected/62b92f08-3bfd-4afd-896c-0d83176fc5b1-kube-api-access-rqhxl\") pod \"authorino-674b59b84c-tt5rb\" (UID: \"62b92f08-3bfd-4afd-896c-0d83176fc5b1\") " pod="kuadrant-system/authorino-674b59b84c-tt5rb" Apr 21 10:18:43.230573 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.230550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhxl\" (UniqueName: \"kubernetes.io/projected/62b92f08-3bfd-4afd-896c-0d83176fc5b1-kube-api-access-rqhxl\") pod \"authorino-674b59b84c-tt5rb\" (UID: \"62b92f08-3bfd-4afd-896c-0d83176fc5b1\") " pod="kuadrant-system/authorino-674b59b84c-tt5rb" Apr 21 10:18:43.334649 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.334581 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-tt5rb" Apr 21 10:18:43.455996 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.455965 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-tt5rb"] Apr 21 10:18:43.459745 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:18:43.459718 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b92f08_3bfd_4afd_896c_0d83176fc5b1.slice/crio-dd8cd31695ea7a1d3a5902e740b3d11e0174b27b12d57c05a62cbcd1395c086a WatchSource:0}: Error finding container dd8cd31695ea7a1d3a5902e740b3d11e0174b27b12d57c05a62cbcd1395c086a: Status 404 returned error can't find the container with id dd8cd31695ea7a1d3a5902e740b3d11e0174b27b12d57c05a62cbcd1395c086a Apr 21 10:18:43.461147 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:43.461128 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:18:44.144163 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:44.144123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-tt5rb" event={"ID":"62b92f08-3bfd-4afd-896c-0d83176fc5b1","Type":"ContainerStarted","Data":"dd8cd31695ea7a1d3a5902e740b3d11e0174b27b12d57c05a62cbcd1395c086a"} Apr 21 10:18:45.149191 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:45.149147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-tt5rb" event={"ID":"62b92f08-3bfd-4afd-896c-0d83176fc5b1","Type":"ContainerStarted","Data":"f3600f017d3a78a41e4e7963d3b73ce9ecae6aa01c607d02d956790d73897757"} Apr 21 10:18:56.139058 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:56.139025 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:18:56.139618 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:56.139601 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:18:56.146860 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:56.146195 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:18:56.148204 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:18:56.148182 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:23:56.165175 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:23:56.165142 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:23:56.166986 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:23:56.166964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:23:56.171353 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:23:56.171333 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:23:56.173547 ip-10-0-132-46 kubenswrapper[2577]: I0421 
10:23:56.173533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:24:01.921784 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:01.921752 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4bfph_39246080-2914-48cc-ba31-4e0e831a5cfb/istio-proxy/0.log" Apr 21 10:24:01.947880 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:01.947855 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-747657c5d4-nfxrn_1181d1e9-066e-44e5-bc62-d300a81ad7a8/router/0.log" Apr 21 10:24:02.505168 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:02.505138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-674b59b84c-tt5rb_62b92f08-3bfd-4afd-896c-0d83176fc5b1/authorino/0.log" Apr 21 10:24:09.516207 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:09.516177 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jsmds_6bcfc06b-afe8-46d7-ae0e-1eef7cdb631d/global-pull-secret-syncer/0.log" Apr 21 10:24:09.598255 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:09.598226 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cslsr_9a641cbb-c7b6-4574-b609-764377332512/konnectivity-agent/0.log" Apr 21 10:24:09.677266 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:09.677235 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-46.ec2.internal_d7f4dde93369cc99ed3e326eef29a265/haproxy/0.log" Apr 21 10:24:12.842094 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:12.842061 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-674b59b84c-tt5rb_62b92f08-3bfd-4afd-896c-0d83176fc5b1/authorino/0.log" Apr 21 10:24:14.391767 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.391691 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tzh5k_d459929e-9c15-4174-a041-b14f3e183024/cluster-monitoring-operator/0.log" Apr 21 10:24:14.518581 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.518551 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-csfpt_fbdea7ca-953d-42cf-a40b-8dcca399d130/monitoring-plugin/0.log" Apr 21 10:24:14.618768 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.618741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vnl9_66ce4f7a-aba6-4b97-8863-86ab1ef171c0/node-exporter/0.log" Apr 21 10:24:14.641404 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.641368 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vnl9_66ce4f7a-aba6-4b97-8863-86ab1ef171c0/kube-rbac-proxy/0.log" Apr 21 10:24:14.662872 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.662821 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vnl9_66ce4f7a-aba6-4b97-8863-86ab1ef171c0/init-textfile/0.log" Apr 21 10:24:14.871837 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.871810 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/prometheus/0.log" Apr 21 10:24:14.890855 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.890828 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/config-reloader/0.log" Apr 21 10:24:14.911105 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.911085 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/thanos-sidecar/0.log" Apr 21 10:24:14.933207 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.933185 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/kube-rbac-proxy-web/0.log" Apr 21 10:24:14.954152 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.954132 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/kube-rbac-proxy/0.log" Apr 21 10:24:14.974870 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.974850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/kube-rbac-proxy-thanos/0.log" Apr 21 10:24:14.997168 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:14.997149 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2ebd8ccf-3b5d-4d84-b499-255a5753f609/init-config-reloader/0.log" Apr 21 10:24:15.071231 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.071205 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9p68k_02d899a4-d56f-4804-bca2-78b9cd085a39/prometheus-operator-admission-webhook/0.log" Apr 21 10:24:15.100350 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.100324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7ccbd7dc58-x6k87_650ceee4-df02-41d5-bcd2-65a7241631a1/telemeter-client/0.log" Apr 21 10:24:15.121594 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.121571 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7ccbd7dc58-x6k87_650ceee4-df02-41d5-bcd2-65a7241631a1/reload/0.log" Apr 21 10:24:15.141452 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.141427 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7ccbd7dc58-x6k87_650ceee4-df02-41d5-bcd2-65a7241631a1/kube-rbac-proxy/0.log" Apr 21 10:24:15.180782 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.180751 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4586d5c6-fqbln_7a60bfc1-ed25-4ed1-8e6e-906df625036f/thanos-query/0.log" Apr 21 10:24:15.216346 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.216327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4586d5c6-fqbln_7a60bfc1-ed25-4ed1-8e6e-906df625036f/kube-rbac-proxy-web/0.log" Apr 21 10:24:15.243219 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.243193 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4586d5c6-fqbln_7a60bfc1-ed25-4ed1-8e6e-906df625036f/kube-rbac-proxy/0.log" Apr 21 10:24:15.264079 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.264053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4586d5c6-fqbln_7a60bfc1-ed25-4ed1-8e6e-906df625036f/prom-label-proxy/0.log" Apr 21 10:24:15.284129 ip-10-0-132-46 kubenswrapper[2577]: 
I0421 10:24:15.284092 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4586d5c6-fqbln_7a60bfc1-ed25-4ed1-8e6e-906df625036f/kube-rbac-proxy-rules/0.log" Apr 21 10:24:15.306241 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:15.306220 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c4586d5c6-fqbln_7a60bfc1-ed25-4ed1-8e6e-906df625036f/kube-rbac-proxy-metrics/0.log" Apr 21 10:24:16.665165 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:16.665135 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-2kfcz_09b7d935-6954-4915-8391-de3719c71560/networking-console-plugin/0.log" Apr 21 10:24:17.154536 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:17.154505 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/1.log" Apr 21 10:24:17.159501 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:17.159480 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-77wnm_24013a62-41fe-4530-aa6f-3ebb1c0b54cc/console-operator/2.log" Apr 21 10:24:17.621113 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:17.621081 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-l49kt_2bf51414-8294-4d06-a0ae-03141b8cbf38/download-server/0.log" Apr 21 10:24:18.096772 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.096729 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-9fjwn_a30b7cfa-6025-45f6-bc51-c813d60a38ae/volume-data-source-validator/0.log" Apr 21 10:24:18.193018 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.192967 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-tt5rb" podStartSLOduration=334.277403347 podStartE2EDuration="5m35.192951891s" podCreationTimestamp="2026-04-21 10:18:43 +0000 UTC" firstStartedPulling="2026-04-21 10:18:43.461318783 +0000 UTC m=+887.855908036" lastFinishedPulling="2026-04-21 10:18:44.376867327 +0000 UTC m=+888.771456580" observedRunningTime="2026-04-21 10:18:45.164292026 +0000 UTC m=+889.558881298" watchObservedRunningTime="2026-04-21 10:24:18.192951891 +0000 UTC m=+1222.587541159" Apr 21 10:24:18.193267 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.193251 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5"] Apr 21 10:24:18.196772 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.196752 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.199142 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.199123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f55k6\"/\"openshift-service-ca.crt\"" Apr 21 10:24:18.199261 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.199192 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f55k6\"/\"kube-root-ca.crt\"" Apr 21 10:24:18.200385 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.200366 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f55k6\"/\"default-dockercfg-nx89h\"" Apr 21 10:24:18.207386 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.207362 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5"] Apr 21 10:24:18.330792 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.330755 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-lib-modules\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.330792 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.330795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-proc\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.331015 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.330834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-podres\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.331015 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.330852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhzgt\" (UniqueName: \"kubernetes.io/projected/86b27c92-0b3b-4652-9269-98ad0475b3ca-kube-api-access-dhzgt\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.331015 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.330876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-sys\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.431905 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.431872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-sys\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " 
pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.431942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-lib-modules\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.431967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-proc\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.431990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-podres\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.432000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-sys\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432068 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.432006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhzgt\" (UniqueName: \"kubernetes.io/projected/86b27c92-0b3b-4652-9269-98ad0475b3ca-kube-api-access-dhzgt\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432252 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.432084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-proc\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432252 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.432127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-lib-modules\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.432252 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.432135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/86b27c92-0b3b-4652-9269-98ad0475b3ca-podres\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.440735 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.440716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dhzgt\" (UniqueName: \"kubernetes.io/projected/86b27c92-0b3b-4652-9269-98ad0475b3ca-kube-api-access-dhzgt\") pod \"perf-node-gather-daemonset-z6mq5\" (UID: \"86b27c92-0b3b-4652-9269-98ad0475b3ca\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.508122 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.508092 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:18.629579 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.629551 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5"] Apr 21 10:24:18.631632 ip-10-0-132-46 kubenswrapper[2577]: W0421 10:24:18.631583 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod86b27c92_0b3b_4652_9269_98ad0475b3ca.slice/crio-2cdc86da22231b82e033fa94289449eeba8d0fa9b48d059647401a7108b9e07f WatchSource:0}: Error finding container 2cdc86da22231b82e033fa94289449eeba8d0fa9b48d059647401a7108b9e07f: Status 404 returned error can't find the container with id 2cdc86da22231b82e033fa94289449eeba8d0fa9b48d059647401a7108b9e07f Apr 21 10:24:18.633184 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.633163 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:24:18.865515 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.865484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g62bp_45c00ff2-e16b-4854-9279-a0a6d25f59c8/dns/0.log" Apr 21 10:24:18.885162 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.885138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g62bp_45c00ff2-e16b-4854-9279-a0a6d25f59c8/kube-rbac-proxy/0.log" Apr 21 10:24:18.969845 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:18.969817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c9648_398c1473-0683-4af4-866e-a4c6405244ff/dns-node-resolver/0.log" Apr 21 10:24:19.271619 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:19.271587 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" event={"ID":"86b27c92-0b3b-4652-9269-98ad0475b3ca","Type":"ContainerStarted","Data":"44a928388e486f560e56edc6c7a0c648a5df16d8a8974feefba32f20ac342663"} Apr 21 10:24:19.271619 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:19.271620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" event={"ID":"86b27c92-0b3b-4652-9269-98ad0475b3ca","Type":"ContainerStarted","Data":"2cdc86da22231b82e033fa94289449eeba8d0fa9b48d059647401a7108b9e07f"} Apr 21 10:24:19.272031 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:19.271688 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:19.287572 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:19.287530 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" podStartSLOduration=1.28751725 podStartE2EDuration="1.28751725s" podCreationTimestamp="2026-04-21 10:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
10:24:19.285860784 +0000 UTC m=+1223.680450054" watchObservedRunningTime="2026-04-21 10:24:19.28751725 +0000 UTC m=+1223.682106541" Apr 21 10:24:19.418048 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:19.418020 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fznzv_1744533a-262f-4150-9f9b-9183b9e8576e/node-ca/0.log" Apr 21 10:24:20.205926 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:20.205894 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4bfph_39246080-2914-48cc-ba31-4e0e831a5cfb/istio-proxy/0.log" Apr 21 10:24:20.232036 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:20.232007 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-747657c5d4-nfxrn_1181d1e9-066e-44e5-bc62-d300a81ad7a8/router/0.log" Apr 21 10:24:20.640624 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:20.640533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p5cg2_f34ed386-407f-400b-a309-9c15bf12db74/serve-healthcheck-canary/0.log" Apr 21 10:24:21.063854 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:21.063819 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-khjfc_6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0/insights-operator/1.log" Apr 21 10:24:21.064140 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:21.064120 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-khjfc_6c2829f4-6e5d-4759-a0ac-3e5e0085f9d0/insights-operator/0.log" Apr 21 10:24:21.143043 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:21.143011 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpjhk_9588b56f-2f02-407d-9d34-92fd50cd0ced/kube-rbac-proxy/0.log" Apr 21 10:24:21.163501 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:21.163479 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpjhk_9588b56f-2f02-407d-9d34-92fd50cd0ced/exporter/0.log" Apr 21 10:24:21.183012 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:21.182990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpjhk_9588b56f-2f02-407d-9d34-92fd50cd0ced/extractor/0.log" Apr 21 10:24:23.285294 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:23.285262 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bbdf94c78-rw82j_7757718f-8e4e-4339-812a-dfa40f1d911f/manager/0.log" Apr 21 10:24:25.285516 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:25.285486 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-z6mq5" Apr 21 10:24:27.229737 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:27.229698 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-c7d6t_b613a503-a4ac-455e-80bf-2ffd14fe2b3d/kube-storage-version-migrator-operator/1.log" Apr 21 10:24:27.230642 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:27.230625 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-c7d6t_b613a503-a4ac-455e-80bf-2ffd14fe2b3d/kube-storage-version-migrator-operator/0.log" Apr 21 10:24:28.664197 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.664120 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/kube-multus-additional-cni-plugins/0.log" Apr 21 10:24:28.684301 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.684277 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/egress-router-binary-copy/0.log" Apr 21 10:24:28.703443 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.703417 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/cni-plugins/0.log" Apr 21 10:24:28.727121 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.727098 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/bond-cni-plugin/0.log" Apr 21 10:24:28.748306 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.748281 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/routeoverride-cni/0.log" Apr 21 10:24:28.767525 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.767496 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/whereabouts-cni-bincopy/0.log" Apr 21 10:24:28.787332 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.787310 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz5k6_e099e319-e542-43c2-9f97-e5b95d49e31d/whereabouts-cni/0.log" Apr 21 10:24:28.813729 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.813707 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b24s6_3295da7d-67d3-49fe-887c-1205e6a605d5/kube-multus/0.log" Apr 21 10:24:28.886096 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.886070 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7czdf_2ae89f79-2df1-4414-b256-f90091f5fa3c/network-metrics-daemon/0.log" Apr 21 10:24:28.907182 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:28.907160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7czdf_2ae89f79-2df1-4414-b256-f90091f5fa3c/kube-rbac-proxy/0.log" Apr 21 10:24:30.416955 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.416917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-controller/0.log" Apr 21 10:24:30.434156 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.434125 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/0.log" Apr 21 10:24:30.440204 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.440184 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovn-acl-logging/1.log" Apr 21 
10:24:30.458968 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.458944 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/kube-rbac-proxy-node/0.log" Apr 21 10:24:30.482116 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.482090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 10:24:30.501576 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.501541 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/northd/0.log" Apr 21 10:24:30.521528 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.521507 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/nbdb/0.log" Apr 21 10:24:30.545923 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.545901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/sbdb/0.log" Apr 21 10:24:30.641075 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:30.641044 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vswh9_100383eb-b81b-458e-9697-d08a4606d57e/ovnkube-controller/0.log" Apr 21 10:24:31.718941 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:31.718913 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mwlr4_fe7a5351-0fba-4368-9b98-1791bc7cfdfc/network-check-target-container/0.log" Apr 21 10:24:32.655062 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:32.655031 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8l6xb_1aa4f54e-36b5-40c5-8faa-641c649d50e7/iptables-alerter/0.log" Apr 21 10:24:33.399633 ip-10-0-132-46 kubenswrapper[2577]: I0421 10:24:33.399602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x9n4l_ecd398ca-3264-4609-b862-e4345b84ce0e/tuned/0.log"